Possible to sync a local repo to an external host?

Hi all,

I read this before I created this topic: Move backup between targets
Looks like it was mentioned that the repo “should work out of the box on a new location”.

My use case is to create a local repository on a NAS / network drive, on a connection with high-speed internet. Then I want the repo files (an encrypted repository) to be copied to an offsite location so the backups aren't lost if something happens locally (e.g. if the house burns down). In short, I want redundancy.

My question is: will the remotely synced files act as a 1:1 clone of the NAS repository, where I could potentially log in and use restic commands against it?
Or do I need to have two repos, where my target server is backed up to two repositories separately?

I hope I managed to explain my use-case well enough.

will the remotely synced files act as a 1:1 clone of the NAS repository, where I could potentially log in and use restic commands against it?

Yes, as long as the sync operation succeeds. You can test it with a small repo beforehand, but there is no reason for it not to work as long as the remote copy is reachable.


Thank you for your reply!

Is restic reliant on having permissions synced as well, or is it enough to have all files readable in case of a recovery? I know rsync with the archive flag (-a) can copy permissions, but I'm not sure whether that is a requirement.

If you’re merely copying the repo files to another location, be aware that this means you have no better protection against ransomware, bit rot at either end, or the local NAS being exploited. What I tend to do, depending on the hardware available, is backup the files locally as normal, and sync the original files to the other location via rsync/Syncthing, and have a restic run on that remote machine that backs up the synced files themselves.
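A minimal sketch of that workflow, assuming hypothetical paths and an SSH host alias `offsite` (adjust to your own setup):

```shell
# 1) Back up to the local NAS repo as usual:
restic -r /mnt/nas/restic-repo backup /home/user/data

# 2) Mirror the repo files to the offsite host. Archive mode (-a)
#    preserves permissions; --delete keeps the mirror an exact clone.
#    The trailing slashes sync the directory contents, not the dir itself.
rsync -a --delete /mnt/nas/restic-repo/ offsite:/backups/restic-repo/
```

On the remote machine you could then run a separate restic backup of `/backups/restic-repo/` itself, as described above.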

You could also back up locally, then backup to a write-only rest server on the remote end.
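For the rest server variant, something like this (hostnames and paths are placeholders; rest-server listens on port 8000 by default):

```shell
# On the remote host: serve repos in append-only mode, so a
# compromised client can add data but cannot delete or overwrite it.
rest-server --path /backups/restic --append-only

# On the local machine: back up over the REST protocol.
restic -r rest:http://remote-host:8000/myrepo backup /home/user/data
```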


For the files inside the restic repo, being readable is enough. The metadata of the backed-up files is stored inside those files anyway. The tip from @ProactiveServices is a good one, though: you might want to check the integrity of the remote repo with restic check after cloning.
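For example, assuming the cloned repo lives at the hypothetical path below and is reachable over SFTP:

```shell
# Verify the repository structure and metadata of the remote clone
# (fast; does not read back all pack data).
restic -r sftp:user@offsite:/backups/restic-repo check
```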


Protection against ransomware can be achieved by making another copy of your repo on e.g. an external USB hard drive.
This hard drive is only connected from time to time and, depending on your paranoia level, only in a secure environment, e.g. a live system.

Regarding bit rot: IMO, it's good to have a second copy of a file. If one copy is damaged by bit rot, the other can be used.

Thank you all! This is why I love posting on the forum, one may end up with really useful suggestions.

Regarding ransomware, that is one of the things I want to secure myself against. Not that I believe there is an immediate risk, but I handle data that must not be lost, and learning more about "being prepared" has paid off in the past.
I recently made a very big mistake on my server and caused a big loss of data. Luckily, restic saved my ass because I had taken precautions :slight_smile: (Very happy about that.)

I could, as @gurkan mentioned, check the integrity of the repo on my NAS before I do an offsite backup of those files. It would be sad to sync a broken repo. Would that be enough?
I could also have a timer to disable the network location between x and y hours.
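The check-before-sync idea could be sketched like this (paths, host, and password file are placeholders):

```shell
#!/bin/sh
# Abort on the first failing command, so a repo that fails
# `restic check` is never propagated offsite.
set -e

export RESTIC_PASSWORD_FILE=/root/.restic-password

# Verify the NAS repo first...
restic -r /mnt/nas/restic-repo check

# ...and only then mirror it to the offsite host.
rsync -a --delete /mnt/nas/restic-repo/ offsite:/backups/restic-repo/
```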

Appreciate all your valuable input!


As for checking my repos, I tend to have the automated backups perform a regular check once per day and a check --read-data once per week, or a subset each week. The documentation explains the difference between the two, but the crux is that --read-data gives you full confidence that the repo is completely intact, at the cost of being much more time-consuming. Glad to hear restic has saved you - it's easy to make a mistake which results in validation of one's backup!
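A sketch of that schedule, with a hypothetical repo path (the subset index `n` in `n/4` would be rotated each week, e.g. by a wrapper script or systemd timer):

```shell
# Daily: structural check (fast, metadata only).
restic -r /mnt/nas/restic-repo check

# Weekly: read back and verify one quarter of the pack files,
# so all data is read over the course of four weeks.
restic -r /mnt/nas/restic-repo check --read-data-subset=1/4
```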
