I currently have a local Linux server running BackupPC (collecting various Windows and Mac folders from the LAN). My router has no open ports apart from outgoing Internet access, and no port forwarding.
Am I reasonably secure right now?
Now I'd like to make a remote copy of this backup in another location (my home?).
Which is the most secure scenario to protect my LAN and my data?
a) Create a VPN and transfer this data to a NAS or another configured Linux server at home?
b) Create a VPN and transfer this data to a REST, HTTPS, S3, etc. server configured at home?
Is the restic push method more secure than the BackupPC pull method for avoiding LAN intrusion when the service is exposed over the WAN?
Could you suggest a detailed guide for the best scenario?
Sorry for my English and my newbie questions.
Security is highly relative. There’s no such thing as “being secure”, unfortunately. It all boils down to what threat level you are up against, and how well you are protected against the specific types of threats you can envision, as well as what routines you have in place to deal with a successful attack (this last point is often more important than trying to be completely secure).
This is too broad of a question to answer in general. It depends on many different factors.
A VPN would make it harder for an attacker to gain access to the traffic between your current backup server and the second/remote backup server. This will normally increase security, but might not be a requirement for you, assuming you already use protected channels for the backups (e.g. sending backups over HTTPS to rest-server and using strong authentication, certificates and even whitelisting for access). It also makes things more complicated, which can weaken security (e.g. due to human errors). I wouldn't consider a VPN necessary.
Same option as above; the difference is just the protocol you use for the communication between the current backup server and the second/remote one. Regarding that protocol, one that uses encryption and is well established and simple to manage is probably to be favored. HTTPS with rest-server isn't a bad choice, but S3 uses certificates and encryption as well, so either works.
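To make the two options concrete, here is what restic's repository URL syntax looks like for each; the hostnames, bucket name, and credentials below are placeholders, not a recommendation:

```shell
# Option b) with rest-server over HTTPS (push from the office server):
restic -r rest:https://officeuser:secret@backup.example.home:8000/office init

# The same idea with an S3-compatible backend instead:
restic -r s3:s3.amazonaws.com/my-backup-bucket init
```

Either way, restic encrypts the data client-side before it leaves your machine; the transport protocol is a separate layer on top of that.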
The important thing is that you have at least one copy of your data in another location, such that if there's a severe flood or other catastrophe, or even just the local house burning to the ground, you always have at least one restorable copy of the data. Preferably you keep two copies in totally different locations (besides the local one, I mean).
It's also a good idea to spread things out over different technologies/platforms, at least on the storage side (e.g. instead of using two remote B2 or S3 storages, use one of each).
You should also consider things like "if an attacker gets access to the computer that is sending my backups offsite, will he be able to delete my backups?". Usually the answer is "yes". You can use a remote backup server with e.g. rest-server running in --append-only mode. Alternatively, the remote backup server can use e.g. the ZFS filesystem and take local snapshots of that filesystem every now and then; if an attacker were to remotely delete the entire repository on that backup server, you would at least be able to restore the last local snapshot that the remote server itself made of its filesystem (which can of course be synced with the backup schedule, so you always have the last backup available).
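As a sketch of the two mitigations just mentioned (the repository path and ZFS dataset name are made up for the example):

```shell
# On the remote backup server: serve repositories in append-only mode,
# so clients can add new data but cannot delete or overwrite old backups.
rest-server --path /srv/restic --append-only

# Independently of that, take periodic ZFS snapshots of the repository
# dataset (e.g. from cron), so even a full remote deletion of the
# repository can be rolled back from a local snapshot.
zfs snapshot tank/restic@$(date +%Y-%m-%d)
```

The two approaches complement each other: append-only protects against a compromised client, the snapshots protect against anything that slips past that.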
Another thing that could be relevant to you is how you can restore. It would be preferable to have a setup where you don't have to restore first from restic and then, from the result of that, use BackupPC to restore the files you are really after. It's nicer if you can restore in one single step, depending on just one single piece of software (ideally one that is cross-platform and just a binary that you can run directly, instead of having to install things).
Both push and pull have their pros and cons.
As you can see, there are a lot of factors to consider when talking about security and redundancy. If you want to, you can elaborate and we'll try to be a little more specific in terms of recommendations. For example, what are the main things you want to protect against?
First of all thank you very much for your detailed answer!
I have two Windows PCs and two Macs on my LAN.
Every workstation has its own external USB disk with automatic incremental backups: AOMEI Backupper for the PCs, Time Machine for the Macs.
Important documents are sometimes backed up manually to a 4 TB WD Cloud on the LAN.
So every backup resides in my office.
Now I'd like to back up part of my systems (some folders, or the entire workstations?) to a remote place.
I don't want to open any dangerous doors into my LAN!
I'd like to do remote backups with the minimum possible risk of intrusion into my LAN
(currently I only have outgoing Internet access, without any port forwarding to services on my router or firewall).
I care about privacy.
If I understand correctly, a well-configured push tool like restic means that, in case of an external attack, there is no break-in to the LAN, but "only" possible access to the encrypted backup data, right?
I would rather risk an attack on the data saved on the remote server (which has lower privacy value) than an attack on my LAN, with the risk of corrupting my workstations.
Right, your main concern is to not have to allow incoming connections from the Internet into your LAN. That is good thinking, and you do not need to do that in order to back your computers up using restic.
So you currently have each computer backing up itself onto an external USB disk. I think it would be a good complement if you simply used restic on each of those computers to back them up to an external repository as well. Then you will have both an on-site backup and an offsite backup. They will be stored on different media, which is good, and there will be different software doing it, which is good too (although I wouldn't want more than three pieces of software involved).
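A minimal sketch of what that looks like on one client; the server address and credentials are hypothetical, and the repository password you choose encrypts the backup, so store it safely:

```shell
# One-time: create the repository on the remote rest-server.
restic -r rest:https://client1:pass@backup.example.com:8000/client1 init

# Recurring (e.g. from a scheduled task): back up the folders you care about.
restic -r rest:https://client1:pass@backup.example.com:8000/client1 \
    backup ~/Documents ~/Pictures
```

The same two commands work on Windows and macOS clients alike, since restic is a single cross-platform binary.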
An alternative to backing up each computer using restic directly to some off-site server is to back them up using restic to a local server, e.g. some NAS. Then you have the option to sync the backup repositories from that NAS onto some external server if you want. The point of this would be to have somewhat faster local backups (to the NAS), but it also adds a bit more complexity, albeit very little in the general picture.
I think you need to realize that your idea of things being secure is probably a bit overestimated. Things aren't as secure as you seem to think they are. Your local network isn't "secure" just because you don't allow incoming traffic. There's so much more involved in security, and it's all highly relevant in the end. Not allowing incoming traffic from the Internet through to your LAN makes your network more secure, but not secure.
That said, if you want to sync files from your NAS to some other storage, rclone will probably do a great job with that, and it can handle a lot of different backends/services. And rclone can also act as the REST server for restic to use as backend, which is sweet. Then again, you might want to run rest-server on the NAS/local server anyway, so that you can have separate users and passwords for all your computers (so they can’t touch each others’ files).
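Both variants mentioned here can be sketched as follows; the remote name and the repository path are assumptions for the example:

```shell
# rclone acting as a restic REST backend in front of any rclone remote,
# so restic clients can push to e.g. B2 through this one process:
rclone serve restic --addr :8000 b2remote:my-backups

# Or rest-server on the NAS/local server with per-user private
# repositories, so each computer can only see its own repository:
rest-server --path /srv/restic --private-repos
```

With --private-repos, each user authenticated via the .htpasswd file is restricted to the repository matching their username, which is exactly the "can't touch each others' files" property described above.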
Did you already have a local server that you wanted to back up your clients to? Note that it’s not a requirement - you can have all your computers back up to a remote server directly if you want. Having a local server that you let the computers back up to and that you then sync to a remote server will just make it somewhat easier to do maintenance on the repositories.
In either case, keep it simple and maintainable - that increases security too.
Hi Rawtaz, I understand that good security in general is difficult to achieve. But I'd like to follow the best possible practice.
"Did you already have a local server that you wanted to back up your clients to?"
I currently have a local Linux server with BackupPC, but it's not indispensable; it was just a try.
My goal would be to back up this server (or another local server containing the various clients' backups) with restic to a remote server (a new server on my LAN at home? An online service?).
Which type of server can I configure by myself? SFTP, rest-server? Which is better for security purposes? Or are online services always better?
This seems unnecessary to me. Why would you back up your clients to a local server, and then a second time back those backups up to a remote server? I recommend just backing up your clients to the local server with restic, and then using rclone or similar to sync those backups to a remote server.
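Syncing the local repositories offsite, as recommended above, could look like this; the local path and remote name are placeholders:

```shell
# Mirror the whole repository directory from the local server to a
# remote configured in rclone (S3, B2, SFTP, ...). Ideally run this
# when no backup is writing to the repositories.
rclone sync /srv/restic remote:restic-offsite
```

This keeps the restic encryption intact end to end: the remote storage only ever sees the already-encrypted repository files.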
There are endless options here. Depends on how much data it is and other things.
You could rent a VPS or other type of server that you run rest-server on and back your clients up to directly (without local server). Or you could have that local server use rclone to sync to Amazon S3 or any other backend rclone supports.
There is no such thing as “always better”, it all comes down to your needs and requirements.
Really, you have to start boiling it down to what you want to aim for and which options are useful for your use case. Look into the various backends restic supports, and look into the backends rclone supports, then start working towards an idea of what you would like to store your backups on.
If it’s of any help, here’s a few examples of what I do:
In one place I back up my clients (Macs) to two external/remote servers on the Internet (separate backup runs, no syncing going on). Those servers run rest-server, so restic backs up directly to them from each client.
In another place I back up all clients (Macs) to a local server running rest-server, and this is the only backup I have of those clients.
In third place I back up all clients (Macs and Windows PCs) to the same server as I mentioned in the previous list item, but over the Internet (so yes, at the place where that server resides, incoming connections to it from the Internet are allowed, but it’s not done by regular port forwarding, the server is on a completely separate network of its own).
Very interesting, thanks!
“In third place I back up all clients (Macs and Windows PCs) to the same server as I mentioned in the previous list item, but over the Internet (so yes, at the place where that server resides, incoming connections to it from the Internet are allowed, but it’s not done by regular port forwarding, the server is on a completely separate network of its own).”
Can you give more details about this setup? I'd like to implement a similar one.
What about "but it's not done by regular port forwarding, the server is on a completely separate network of its own"?
Which guide can I follow to set up a rest-server at my home? And how do I use restic to back up my clients from my office LAN to that home LAN over the Internet?
I have two different routers with two different Internet connections and two different static IPs at my office, and one router with one static IP at home.
Here is the section in the manual that talks about how to specify the repository URL for a rest-server. Besides that you can read the rest of the manual on that same website. There’s also a lot of guides on using restic in general on the Internet, just look for them and you will find it.
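To make the URL format concrete, connecting to a rest-server from a client typically looks like this (host, port, and credentials are placeholders):

```shell
# List the existing snapshots in a repository served by rest-server:
restic -r rest:https://user:password@host:8000/myrepo snapshots
```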
If you have further questions, please be more specific. This entire thread is a long list of me answering unspecific questions. If you want more detail, explain what in particular you want more details about; otherwise I don't know what you're asking for.
I suggest you just start trying restic out, and why not rest-server as well. Just play with it, and you'll see how to use it.