I’m just in the process of figuring out how I want to set up my Windows home PC backup (which I want to be bootable and as close to the original state as possible after restoring, and efficient in the ways hashed chunk based backups can be), and right now it looks like this is how I’ll be doing it:
C: 500 GB NVMe SSD for Windows: kept as clean as reasonably possible, so usage% is low, but it’s not a clean install.
X: 2TB (soon to be 4TB, maybe) SATA SSD for data: everything that doesn’t install itself on C: goes on X:; usage will probably stay over 50% with over-provisioning enabled.
Z: 5TB portable HDD (might replace or supplement with a 4TB portable SSD depending on how I feel) for backups: on this, I want to store as many versions of backups as possible while not wasting time and space with full backups and backup chains of disk images.
The form of backup I have in mind (disregarding 3-2-1 for now) is like this:
- Split the C: drive into two 200GB partitions (the second one is I:).
- Automatically and regularly create a disk image of C: and the extra partitions needed to boot, excluding the page file, swap file, hibernation file, temporary files, etc., and save it on I:, keeping only one version.
- Automatically back up everything, including that image, to Z:, probably using restic.
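For what it’s worth, the restic side of that last step can be sketched as a command builder. The repo path Z:\restic-repo and the exclude list are placeholders I made up; -r, backup, --exclude, and --use-fs-snapshot are real restic flags:

```python
# Sketch of the restic invocation the plan implies. The repo path and the
# exclude patterns are illustrative placeholders, not a tested configuration.
def restic_backup_cmd(repo, paths, excludes, use_vss=True):
    cmd = ["restic", "-r", repo, "backup", *paths]
    if use_vss:
        cmd.append("--use-fs-snapshot")  # use VSS so open files get backed up
    for pattern in excludes:
        cmd += ["--exclude", pattern]
    return cmd

cmd = restic_backup_cmd(
    "Z:\\restic-repo",
    ["C:\\", "X:\\"],
    ["C:\\pagefile.sys", "C:\\hiberfil.sys", "C:\\swapfile.sys"],
)
```

Feed the list to subprocess.run, or join it for a Task Scheduler action; the builder just documents the shape of the command.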
There are some things I’m not sure about, though:
- Can restic maintain NTFS hardlink structure? As in, two hardlinked filenames get backed up, then when they’re restored, they’re still the same file in two places (assuming they’re restored at the same time, of course)?
- Is there some specific software configuration I can use to make sure that the block structure isn’t altered too much in the disk image, so that restic can efficiently deduplicate its data with the individual files? I figure leaving the disk image uncompressed and unencrypted would probably do the job (it seems to be possible, based on my quick test with Veeam Agent and Duplicacy, but a lot of the data still isn’t deduped), but has anyone done this properly to know for sure?
- I can’t get Veeam Agent to keep only one full backup, so it keeps filling I:, and it’s also more complex than I need and too un-FOSS-y for my current preferences, so I don’t think I’ll be using it. Any good and simple alternatives that suit my needs?
- Is restic appropriate for this use case?
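On the block-structure question: chunkers like restic’s are content-defined, so an uncompressed, unencrypted image that stores file data as contiguous runs can dedup well against a file backup of the same data even at a different offset, while compression destroys the match. A toy demonstration of the principle (this simplistic window-hash chunker is my own illustration, not restic’s actual algorithm):

```python
import hashlib
import random
import zlib

def chunk_hashes(data, window=16, mask=0x3FF, min_size=128):
    """Toy content-defined chunker: cut wherever a hash of the last
    `window` bytes matches a fixed bit pattern, so cut points depend
    only on local content and re-synchronize after an insertion."""
    hashes, start = set(), 0
    for i in range(len(data)):
        if i - start < min_size or i + 1 < window:
            continue
        h = int.from_bytes(hashlib.sha256(data[i + 1 - window:i + 1]).digest()[:4], "big")
        if h & mask == mask:
            hashes.add(hashlib.sha256(data[start:i + 1]).hexdigest())
            start = i + 1
    hashes.add(hashlib.sha256(data[start:]).hexdigest())
    return hashes

random.seed(42)
payload = bytes(random.choice(b"ACGT") for _ in range(50_000))  # stand-in for one file
raw_image = b"IMGHDR" * 64 + payload + b"\x00" * 4096           # same file inside an uncompressed "image"
packed_image = zlib.compress(raw_image)                         # same image, compressed

file_chunks = chunk_hashes(payload)
raw_overlap = len(file_chunks & chunk_hashes(raw_image)) / len(file_chunks)
packed_overlap = len(file_chunks & chunk_hashes(packed_image)) / len(file_chunks)
```

raw_overlap should come out high (most chunks shared despite the offset), while packed_overlap collapses to roughly zero, which is why the image needs to stay uncompressed for cross-dedup to work.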
I would appreciate input on this, and wouldn’t mind an open-ended discussion to find a better strategy if one exists. Thanks!
PLEASE IGNORE THE ABOVE POST!
Since I can’t edit my post anymore, I’ll just reply with what I wanted to edit it down to:
I’m trying to set up my Windows home PC to back up everything so that the system can be restored to more or less the same state it was in at backup time, in the way that a disk image backup would, with the additional advantages that come with hashed chunk based backups like restic’s. I’m aiming to back up as many versions of my system’s state as possible, without having to do full backups at all after the first time, ideally. So: state-complete, version-flexible, space-efficient, time-efficient.
- A full image of the OS partitions including hidden system partitions can easily fit on the same drive, and usually takes less than 3 minutes to perform.
- I don’t have any intermediate space to store an image of my data drive.
- I will very likely be using (NTFS) hard links and symbolic links to organize my media libraries.
My plan is to do image backups of the OS partitions onto the same drive (daily full backups only), and file backups of everything including that image backup (hourly).
- Does anyone here know of any specific disk imaging software that can structure the images to allow restic to achieve very high deduplication rates between disk images and the files on disk? Especially, with the ability to schedule these image backups for specific times, only keeping 1 image at a time?
- Can restic preserve NTFS hard link and symbolic link structure without making copies of files and folders?
Suggest that you use a tool made for Windows system backup, rather than a tool created to back up files / folders. Macrium Reflect (there’s a free version) does incremental backups and fully supports Windows disks. Use Restic for your data disks.
Well, thanks, but that kind of software is what I was talking about. I wanted to know if there’s a particular one that’s known to make uncompressed images that deduplicate highly efficiently with a file backup of the same drive under restic, or between different versions of the image backup, so I can store lots of them using restic’s deduplication mechanism. I might as well try your suggestion, though, and just see what the numbers are like in the end.
Also, I still need to know how restic deals with NTFS hard links and symlinks. Actually, preserving symbolic links seems to be supported on Windows, but the manual’s wording is more ambiguous for hard links, as they’re only mentioned in the context of other operating systems and FUSE. I’ve tried Googling it a bit but couldn’t find a clear answer for Windows. Then again, it can’t be too hard to just try it and see. I should just do that.
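The try-it-and-see check is easy to script: compare (st_dev, st_ino) identity for a linked pair before backup and after restore. A minimal sketch using only the standard library (file names are made up; on Windows, Python 3.5+ fills st_ino and st_nlink from NTFS file IDs):

```python
import os
import tempfile

def same_file(a, b):
    """True if paths a and b are hard links to one underlying file."""
    sa, sb = os.stat(a), os.stat(b)
    return (sa.st_dev, sa.st_ino) == (sb.st_dev, sb.st_ino)

# Demo: a hard-linked pair, as it would exist before backup.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "movie.mkv")
    linked = os.path.join(d, "by-genre.mkv")
    with open(original, "wb") as f:
        f.write(b"payload")
    os.link(original, linked)  # NTFS hard link on Windows, POSIX link elsewhere
    preserved = same_file(original, linked) and os.stat(original).st_nlink == 2
```

If the same check on a restored tree returns False, restic has restored two independent copies rather than one hard-linked file.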
Ah, now that I’ve read your post properly, that makes more sense.
Why do you need daily backups of your OS? I do OS image backups every few months. In practice, when I had a disk failure my OS had been installed for 3 years, so I decided to reinstall, moving straight to Windows 11. I keep absolutely no data on my OS partition; I have my 1TB SSD partitioned into 250GB Windows and 750GB data, plus I have a 4TB SSD.
Macrium run weekly for your OS partition should be sufficient; it’s incremental and will be fairly space-efficient. I wouldn’t run that through Restic; I’d just back those images and incrementals up to an offsite disk / cloud.
Data drives I’d back up using Restic. I don’t know how it does with links and such; I rarely use them.
It’s not a mission-critical requirement; I just figure that if I can more frequently back everything up without wasting a lot of time (3 minutes) and space (clever deduplication), perhaps there’s no reason not to do so. And more granular control over which version of the backup I can restore just seems like a good idea, but I could knock it down to weekly and just deal with the smaller variations using file backups.
I suppose the most important thing with the image backup is that once it’s restored, it boots and is recognized by the system as the same installation. But I wonder if there’s anything else a file backup can’t cover, like specific settings or application data, which is why I want to do a file backup of the OS drive in addition to the OS image backup, despite already directing as much as I can to my data drive. And if I’m doing both an image backup and a file backup of the same drive, it’d be best to minimize the extra storage requirement by using an imaging program whose backup files get chunked by restic the same way as the matching data on the disk.
But idk, maybe I’m trying to find a problem for my solution. I’ll think about it. The best backup is one that actually exists.
I think you’re overthinking this. OS drive contents don’t change often. If you have data on the drive, back that up with Restic daily; otherwise weekly or monthly OS partition backups should be fine. If you need to restore from a month-old backup, most likely you’ll just have to apply a few patches and maybe install one piece of software.
There’s no point doing a file-based backup of your OS drive, particularly the Program Files and Windows folders. OS and programs must be installed to be useful. Image backups are what you need there.
My suggestion: Macrium Reflect weekly backup of the OS (though really I do it three-monthly), and Restic daily to back up your data, whichever drive it lives on.
Just FYI for anyone coming across this thread: Macrium is not open source and as of 2024 no longer free.
Read the docs for Veeam and uncheck Periodical Active Full Backup.
Restic is a file backup program.
I make 2 Veeam image backups per month and daily backups of data (emails, docs) with restic.
Okay, for now I’ve decided to just do a weekly OS image backup and an hourly restic backup of everything except the image, without worrying about integrating the image into the chunk base. It’d only save about 100GB max anyway, so too much hassle for very little gain.
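If the hourly schedule sticks, a retention policy keeps the repo from growing without bound. The --keep-* and --prune flags below are real restic options; the counts are just an illustrative assumption:

```python
# Sketch of a pruning policy for hourly snapshots. The keep counts are
# illustrative placeholders, not a recommendation.
def restic_forget_cmd(repo, hourly=24, daily=7, weekly=8, monthly=12):
    return [
        "restic", "-r", repo, "forget",
        "--keep-hourly", str(hourly),
        "--keep-daily", str(daily),
        "--keep-weekly", str(weekly),
        "--keep-monthly", str(monthly),
        "--prune",  # actually delete data no snapshot references anymore
    ]

cmd = restic_forget_cmd("Z:\\restic-repo")
```

Run it after each backup, or on its own schedule; without --prune, forget only drops snapshot records and reclaims no space.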