How often do you run backups on your system?
-
I would, but the other side isn't ZFS, so I went with Borg instead.
-
I honestly don't have too much to back up, so I run one full backup job every Sunday for the different directories I care about. The jobs check each directory and only back up changed or new files. I don't have the space to back up everything, so I only take the smaller, most important stuff. The backup software also supports live monitoring if I enable it, and some of my jobs have that turned on since I didn't see any reason not to. To save money, I reuse the NAS drives that start reporting errors and get replaced with new ones. So far, so good.
Backup software is Bvckup 2; Reddit was a huge fan of it years ago, so I gave it a try. It was super cheap for a lifetime license at the time, and it's super lightweight. Sorry, there's no Linux version.
-
The longest interval is every 24 hours, with some jobs more frequent, like every 6 hours or so for my game servers.
-
Not as often as I should
-
No backup for my media, only redundancy.
For my Nextcloud data, any time I make major changes.
-
Just like the “s” in IoT stands for “security”
-
I use Duplicati for my backups, and have backup retention set up like this:
Save one backup each day for the past week, then save one each week for the past month, then save one each month for the past year.
That way I have granular backups for anything recent, and the further back in the past you go, the less frequent the backups are, to save space.
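(If anyone wants to copy this: Duplicati 2 exposes that schedule as a custom retention policy string. A rough sketch; the backend URL and source path are placeholders:)

```
# Keep one version per interval within each timeframe:
#   7D:1D  -> last 7 days, one backup per day
#   4W:1W  -> last 4 weeks, one backup per week
#   12M:1M -> last 12 months, one backup per month
duplicati-cli backup "b2://my-bucket/backups" /home/me/important \
  --retention-policy="7D:1D,4W:1W,12M:1M"
```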
-
I continuously back up important files/configurations to my NAS. That's about it.
IMO, people who add redundancy to or back up their media are insane... It's such an incredible waste of space. Having a robust media library is nice, but there's no reason you can't just start over if you have data corruption or something. I have TBs and TBs of media that I could redownload in a weekend if something happened (if I even wanted to). No reason to waste backup space, IMO.
-
Absolutely, my backup solution is actually based on BTRFS snapshots. I use btrbk (already mentioned in another reply) to take the snapshots and copy them to another drive. Then a nightly restic job backs up the latest snapshot to B2.
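For anyone wanting to copy that last leg, the nightly restic job can be a short script like this sketch; the snapshot path, bucket, and subvolume name are placeholders, and it assumes B2 credentials are already set in the environment:

```
#!/bin/sh
# Back up the newest btrbk snapshot to Backblaze B2 (sketch; paths are placeholders).
# Assumes B2_ACCOUNT_ID and B2_ACCOUNT_KEY are set for restic.
export RESTIC_REPOSITORY="b2:my-bucket:restic"
export RESTIC_PASSWORD_FILE="/root/.restic-pass"

# btrbk snapshot names embed a timestamp, so a lexical sort finds the newest.
LATEST=$(ls -1d /mnt/btr_pool/btrbk_snapshots/home.* | sort | tail -n 1)

restic backup "$LATEST"
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune
```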
-
That's good. You can also check out btrbk: it's a tool that can take snapshots for you, like Timeshift, but also back them up somewhere else.
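A minimal btrbk.conf looks roughly like this; the volume, subvolume, and target paths are examples, and the retention values are just a starting point (see the btrbk man page for the full syntax):

```
# /etc/btrbk/btrbk.conf -- minimal sketch; all paths are placeholders
timestamp_format        long
snapshot_preserve_min   2d
snapshot_preserve       14d
target_preserve_min     no
target_preserve         20d 10w *m

volume /mnt/btr_pool
  snapshot_dir btrbk_snapshots
  subvolume home
    target /mnt/backup/btrbk
```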
-
I classify the data according to its importance (gold, silver, bronze, ephemeral). That classification determines how often ZFS snapshots are taken (every 15 minutes to every few hours) and how long they are retained on the server (days to years). Once a day I then send the more important data that I can't restore, or could only restore with great effort (gold and silver), to another server. For bronze, the ZFS snapshots and a few days of retention on the server are enough for me, since it's usually data I can regenerate (build artifacts or similar) or that simply isn't that important. Ephemeral is for unimportant data such as caches or pipelines.
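A tiered schedule like this can be driven by plain cron plus zfs send/recv; the dataset names, hosts, and snapshot labels in this sketch are invented, and the incremental send assumes a wrapper script tracks which snapshot was sent last:

```
# crontab sketch -- pool/dataset/host names are placeholders.
# Note: % must be escaped as \% inside crontab entries.

# Gold: snapshot every 15 minutes.
*/15 * * * * zfs snapshot tank/gold@auto-$(date +\%Y\%m\%d-\%H\%M)

# Bronze: snapshot every 6 hours.
0 */6 * * * zfs snapshot tank/bronze@auto-$(date +\%Y\%m\%d-\%H\%M)

# Nightly: incrementally send gold to the second server
# (assumes the "last-sent"/"latest" snapshots are maintained elsewhere).
30 2 * * * zfs send -I tank/gold@last-sent tank/gold@latest | ssh backup-host zfs recv -F backup/gold
```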
-
Daily, to all three of my locations:
- local on the server
- in-house but on a different device
- offsite
But not all three destinations back up the same amount of data, due to storage limitations.
-
Proxmox servers are mirrored zpools, not that RAID is a backup. Replication between Proxmox servers every 15 minutes for HA guests, hourly for less critical guests. Full backups with PBS at 5 AM and 7 PM, two sets apiece, with one set going off-site and rotated weekly. Differential replication every day to zfs.rent. I keep 30 dailies, 4 weeklies, 24 monthlies, and infinite annuals.
Periodic test restores of all backups at various granularities at least monthly or whenever I'm bored or fuck something up.
Yes, former sysadmin.
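For reference, that retention maps almost one-to-one onto PBS prune options; a sketch with placeholder repository and group names (PBS has no literal "infinite" setting, so the annuals are approximated here with a large keep-yearly):

```
# Sketch only: repository and backup group are placeholders.
proxmox-backup-client prune vm/100 \
  --repository root@pam@pbs.example.lan:datastore \
  --keep-daily 30 --keep-weekly 4 --keep-monthly 24 --keep-yearly 100
```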
-
It becomes a whole different thing when you yourself are a creator of any kind. Sure, you can re-torrent TBs of movies, but you can't retake that video from 3 years ago.
I have about 2 TB of photos I took. I classify that as media.
-
Right now, I have a cron job set to run on Monday and Friday nights. Is this too frequent?
Only you can answer this. How many days worth of data are you prepared to lose?
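For context, a Monday/Friday-night schedule like that is a single crontab line; the script path here is hypothetical:

```
# Run at 23:30 on Monday (1) and Friday (5); the script path is a placeholder.
30 23 * * 1,5 /usr/local/bin/backup.sh
```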
-
One Proxmox server with VMs on mirrored 256 GB SSDs, which backs up weekly. Backups are also uploaded to the cloud with an rclone mount and copied to the larger mergerfs storage pool (2 x 8 TB data + 1 x 8 TB parity) mounted in one of the VMs, to keep backups from filling the SSDs. Every so often I copy the media directories from the pool to a spare hard drive, which I keep onsite because I'm naughty and this is mostly just movies and TV I don't want to bother downloading again. Before I had the cold backup, I did have to download everything again once after I accidentally broke something, but with gigabit fiber and the VM backups it probably took under a week for Sonarr and Radarr to fetch everything.
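The cloud leg of a setup like this can also be a plain scheduled copy instead of a mount; a sketch, assuming the default vzdump directory and an rclone remote already configured as `cloud:`:

```
# Copy Proxmox's vzdump output to a configured rclone remote.
# /var/lib/vz/dump is the default dump dir; "cloud:" is a placeholder remote.
rclone copy /var/lib/vz/dump cloud:proxmox-backups --progress
```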
-
I have
- Unraid backing up its USB boot drive
- Unraid appdata getting backed up weekly by a community application (CA app backup), and I use rclone to back that up to an old Box account (100 GB for life...). I did have it encrypted, but it seems I need to fix that...
- A parity drive on my Unraid (8 TB)
- I'm trying to figure out how to use rclone to back up my photos to Proton Drive, so that's next (rough sketch below).
Music and media aren't too important yet, but I would love some insight.
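Recent rclone releases ship a Proton Drive backend, so the photos step could look like this sketch (assumes rclone >= 1.64; the remote name and paths are placeholders):

```
# One-time: create a remote interactively and pick the "protondrive" backend.
rclone config

# Then sync the photo share up; paths and remote name are placeholders.
rclone sync /mnt/user/photos proton:photos --progress
```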
-
Assuming it is on: Daily
-
Maybe for common stuff, but some don't want 720p YTS or YIFY releases.
There are also some releases that don't follow TVDB aired order (which Sonarr requires), and matching 500 episodes with deviating names manually isn't exactly what I call 'fun time'.
And there are also rare releases that just aren't seeded anymore in that specific quality, or present on Usenet. So yes: backing up some media files may be important.
-
I back up all of my Proxmox LXCs/VMs to a Proxmox Backup Server every night, and sync those backups to another PBS in another town.
A second Proxmox backup runs every noon to my NAS.
(I know, the 3-2-1 rule isn't met...)
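For anyone setting up the PBS-to-PBS leg: sync jobs in PBS pull from a remote, so something like this sketch runs on the receiving server (the remote, datastore, and job names are all placeholders):

```
# Pull backups nightly from a registered remote PBS into the local datastore.
# "pbs-offsite" must already exist as a remote; all names are placeholders.
proxmox-backup-manager sync-job create nightly-offsite \
  --store backups \
  --remote pbs-offsite \
  --remote-store backups \
  --schedule daily
```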