I’ve been thinking about exploring ways to store backups in cold storage without having to manually set everything up every month or so.
I thought about having an external drive (or DAS) connected to a Raspberry Pi that turns on periodically, runs a script, and then shuts down.
The alternative is to manually connect the external drive to my TrueNAS machine, run the script, and then unmount and remove the drive.
Another option is to ditch the cold storage altogether and just create a backup-only pool, either on the existing TrueNAS machine or on a second TrueNAS box dedicated to backups, but this would increase energy costs and wear the disks unnecessarily for an operation that only runs once a month.
What’s the general strategy people use for a local backup on another machine in the same geo-location?
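Roughly what I had in mind for the Pi side is a small script along these lines (a sketch only: the pool name coldbackup and the paths are placeholders, and waking the Pi up would be handled outside the script, e.g. by a plug timer):

```python
#!/usr/bin/env python3
"""Sketch of one cold-backup run on the Pi: import the pool on the external
drive, rsync the data onto it, export the pool, then power the Pi off.
Pool and path names below are placeholders for illustration only."""
import subprocess
import sys

POOL = "coldbackup"            # hypothetical single-disk ZFS pool on the external drive
SOURCE = "/mnt/data/"          # hypothetical mount point of the data to back up
TARGET = f"/{POOL}/monthly/"   # destination directory inside the backup pool

def run(*cmd: str) -> None:
    """Run a command, echoing it first, and abort the script if it fails."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main() -> None:
    run("zpool", "import", POOL)                        # attach the backup pool
    try:
        run("rsync", "-a", "--delete", SOURCE, TARGET)  # mirror the source onto the cold disk
    finally:
        run("zpool", "export", POOL)                    # leave the disk cleanly detachable
    run("shutdown", "-h", "now")                        # turn the Pi back off

if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(f"Backup step failed: {exc}")
```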
I have tried different ways before finding what the best pattern is for me. IMHO a lot depends on the use case, and what is viable for one person may not be for someone else (both in cost and in work effort).
So after local replication, USB replication, etc., I realised that building a secondary budget system to perform weekly replication was the better choice. I put it literally in my garage, connected to the LAN through a powerline adapter… and I just power it on, do the replication, and power it off.
This requires a few minutes of work, just one button to press. Plus I have a PC for other tasks (like file sync to SMR disks) just by swapping the boot device (and literally, in an extreme case of need, spare parts for the main system).
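If it’s useful, the replication step itself can be scripted; a rough sketch of the kind of thing I mean, where the dataset and host names are placeholders and not my real ones:

```python
#!/usr/bin/env python3
"""Sketch of a weekly push replication to a second box over SSH:
snapshot the source dataset, send it (incrementally when possible) to the
remote pool, then shut the remote machine down. All names are placeholders."""
import datetime
import subprocess

SRC = "tank/data"          # hypothetical source dataset on the main system
DST = "backup/data"        # hypothetical target dataset on the garage box
REMOTE = "root@garage"     # hypothetical SSH target for the garage box
PREFIX = "weekly-"

def snapshots(dataset: str) -> list[str]:
    """Return snapshot names of a local dataset, oldest first."""
    out = subprocess.run(
        ["zfs", "list", "-H", "-t", "snapshot", "-o", "name",
         "-s", "creation", "-d", "1", dataset],
        check=True, capture_output=True, text=True).stdout
    return [line.split("@", 1)[1] for line in out.splitlines() if "@" in line]

def main() -> None:
    new = PREFIX + datetime.date.today().isoformat()
    previous = [s for s in snapshots(SRC) if s.startswith(PREFIX)]
    subprocess.run(["zfs", "snapshot", f"{SRC}@{new}"], check=True)

    if previous:
        # Incremental send on top of the most recent weekly snapshot.
        send = ["zfs", "send", "-i", f"{SRC}@{previous[-1]}", f"{SRC}@{new}"]
    else:
        send = ["zfs", "send", f"{SRC}@{new}"]   # first run: full send

    # Pipe the stream straight into `zfs recv` on the remote side.
    sender = subprocess.Popen(send, stdout=subprocess.PIPE)
    subprocess.run(["ssh", REMOTE, "zfs", "recv", "-F", DST],
                   stdin=sender.stdout, check=True)
    sender.stdout.close()
    if sender.wait() != 0:
        raise RuntimeError("zfs send failed")

    subprocess.run(["ssh", REMOTE, "poweroff"], check=True)  # the "power off" step

if __name__ == "__main__":
    main()
```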
I use a locally attached single-disk ZFS pool and rsync to it (could have used replication…). Then when done, I put the disk in an anti-static bag, and put that in a hard-shell Seahorse case. If I have a remote place (office, storage unit, etc…), then I take it, or schedule to take it, to that remote place. Otherwise it sits on a shelf in a spare room.
One of the tricky things about backups is that people have overly broad expectations. For example:
If the backup set won’t fit on a single disk, is a single pool / group of disks better? Or does the user want the files spread out over multiple, but single, disks?
Does the user need easy disaster recovery, like using FAT or exFAT instead of ZFS?
How often does it need to run?
Does it need to be off-site? If so, is it manually taken off-site, or sent over the Internet to an off-site location?
So in practice, all home or small office backups are tailored to that user or small business.
A Raspberry Pi with local storage & powering on as needed sounds fine. Ideally you would use ZFS so you could scrub your backup disk and verify it is good.
I hope you schedule periodic times for scrubs and smart tests.
My experience is that any manual process will fail.
So I use offsite online replication.
A benefit of powering off is that you can’t mess up the backup while it’s powered off.
BUT I would suggest 1) redundancy, i.e. at least a mirror for the target, and 2) periodic scrubs and SMART tests. That will allow you to detect and repair issues before they get too bad.
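As a rough sketch of what that periodic maintenance could look like if scripted (pool and device names here are placeholders, not a prescription):

```python
#!/usr/bin/env python3
"""Sketch of a monthly maintenance kick-off for a backup pool: start a
scrub, start long SMART self-tests on the member disks, and print the
current health summaries. Pool and device names are placeholders."""
import subprocess

POOL = "backup"                      # hypothetical mirrored backup pool
DISKS = ["/dev/sda", "/dev/sdb"]     # hypothetical member disks of the mirror

def run(*cmd: str) -> str:
    """Run a command and return its output (no abort on non-zero exit,
    since smartctl uses non-zero exit bits to flag findings)."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout + result.stderr

def main() -> None:
    print(run("zpool", "scrub", POOL))               # start a scrub (runs in the background)
    for disk in DISKS:
        print(run("smartctl", "-t", "long", disk))   # queue a long SMART self-test

    # These summaries reflect the last completed scrub/tests; check again
    # later (e.g. the next day) for the results of the runs started above.
    print(run("zpool", "status", POOL))
    for disk in DISKS:
        print(run("smartctl", "-H", disk))           # overall SMART health verdict

if __name__ == "__main__":
    main()
```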
Interesting, I thought much more varied solutions would be proposed, but I like the idea that each person uses a different backup solution that works for them.
I’ll set up a system and take on board the feedback about automating scrubs and SMART tests.
Of course, man! At least monthly long SMART tests and scrubs, and the pool is small but mirrored.
I can’t forget: Google Calendar helps me remember.
Plus, I have another copy of the data as a simple file sync, encrypted with BitLocker, on a bunch of small SMR disks that can’t be used with ZFS. The bigger ones get rotated to my parents’ home.
All that stuff costs me around 100–120€.
For the moment, the only thing that has made me struggle is the powerline connection; I had to test a lot of solutions before getting it to work well.