Imagine that I have a backup server (just a regular machine with full root access and enough storage), so basically it’s fine to use any open-source software for the backend (ssh/sftp, minio, REST, or anything else).
The machine is usually in the same LAN.
Any suggestions on which storage backend gives the fastest backups? Maybe somebody has already benchmarked this?
If you have full access to the host, the REST backend is usually the fastest one.
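For reference, a minimal setup might look like the following sketch; the hostname `backup-host`, the paths, and the repo name `myrepo` are just examples, not anything from this thread:

```shell
# On the backup server: serve a repository directory over HTTP.
# --no-auth may be acceptable on a trusted LAN; otherwise set up
# .htpasswd authentication instead.
rest-server --path /srv/restic --listen :8000 --no-auth

# On the client (password and URL are placeholders):
export RESTIC_PASSWORD='example-password'
restic -r rest:http://backup-host:8000/myrepo init
restic -r rest:http://backup-host:8000/myrepo backup /home
```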
I wonder how the REST backend performs compared to a repository on a mounted NFS share, given a reasonably powerful backup server. I’d reckon REST would be much faster over WiFi, since NFS over wireless is way too slow. But over 1000BASE-T, or even 10GBASE-T?
At home I reached around 105 MB/s to my home NAS with the REST backend.
I haven’t compared it to NFS, but I get pretty good speeds with the REST backend compared to SCP – that’s for sure.
I’ve configured the REST backend for now. I’ll probably compare it with NFS (over 1000BASE-T Ethernet) later.
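If anyone wants to run the same comparison, a rough sketch (hostname, export path, and repo name are assumptions): mount the server’s export and point restic’s local backend at it, then time both against the same data.

```shell
# Mount the server's NFS export (paths and hostname are examples):
sudo mount -t nfs backup-host:/srv/restic /mnt/restic

# Same repository, accessed via the local backend over NFS:
time restic -r /mnt/restic/myrepo backup /home

# Versus the REST backend:
time restic -r rest:http://backup-host:8000/myrepo backup /home
```

Note that restic’s deduplication makes a second run of the same backup much faster, so for a fair comparison each backend should back up a fresh data set (or the repo should be re-initialized between runs).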
Is it safe (in terms of data safety) to run certain repository operations (forget + prune) using the ‘local’ backend directly on the backup server, while the same repo is still being served, and possibly used, over REST?
Judging by some conversations we had on IRC, the answer is: yes, I think it should be safe to use the same repo both locally and via the Rest Server.
Pay attention to file ownership, though. If you run the local operations as root, also run the Rest Server as root. If you use some other user for local operations, make sure the Rest Server runs under that same user.
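To illustrate the ownership point, something like this sketch (the `restic` service user and the repo path are hypothetical):

```shell
# Check which user owns the repository files:
stat -c '%U' /srv/restic/myrepo/config

# Run the local maintenance as that same user, i.e. the one the
# Rest Server also runs under (here a hypothetical 'restic' user):
sudo -u restic restic -r /srv/restic/myrepo \
    forget --keep-daily 7 --keep-weekly 4 --prune
```

If the users don’t match, new files written by the local run can end up unreadable or unwritable for the Rest Server process, which is the failure mode being warned about above.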