I have to back up about 3 TB. My options are: make a single backup on Google Drive and keep everything in one repository (I would have to upload 300 GB at a time and then empty the Google Drive cache), or create a separate repository for each folder. In practice: from the hard disk I would create a backup of each individual folder on the PC, and then move it to Google Drive. If you have other options, I'd be very happy to hear them. Or could I use MEGA? I think it's faster. I won't be restoring all the backups at once, only a single folder (depending on what I lose).
Sorry for my English.
It sounds like you're planning a rather crazy way of doing it.
Use rclone to connect to Gdrive, and back up everything to it.
Is rclone faster than the Google Drive app? I'm on Windows.
Hard to tell, but it can certainly saturate a 1 Gbit connection.
Even if not, you only run the huge initial backup once - after that I'd guess you'll have small delta updates.
Configure rclone to access Gdrive - let's say you call this remote Gdrive. Make sure you configure your own client_id.
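Setting up that remote might look roughly like this (the remote name Gdrive is just the example used here, and the client_id/client_secret values are placeholders you replace with your own from the Google Cloud console):

```shell
# Interactive setup: choose "drive" as the storage type and
# paste your own client_id and client_secret when prompted.
rclone config

# Or create the remote non-interactively (placeholder credentials):
rclone config create Gdrive drive \
    client_id YOUR_CLIENT_ID \
    client_secret YOUR_CLIENT_SECRET

# Verify the remote works by listing top-level directories:
rclone lsd Gdrive:
```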
Configure restic to use your rclone setup.
restic -r rclone:Gdrive:backupDir init
Run your backup:
restic -r rclone:Gdrive:backupDir backup D:/path/to/data
You do not have to finish the whole backup in one go. It might even be impossible, since Gdrive only allows uploading 750 GB per day. You can stop it any time and continue later - restic will take care of everything :)
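Since you mentioned you'd only ever restore a single folder, restic handles that too - you don't have to pull down the whole repository. A sketch (the folder path and restore target are examples):

```shell
# List snapshots to find the one you want to restore from
restic -r rclone:Gdrive:backupDir snapshots

# Restore only one folder from the latest snapshot
restic -r rclone:Gdrive:backupDir restore latest \
    --include /path/to/data/photos \
    --target D:/restore
```

Only the data blobs needed for the included folder get downloaded, which is exactly the "I lost one folder" scenario you described.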