Error Backing Up

Hello! I’m new here; I discovered Restic while looking for a cloud backup solution, and I really like it. I have a script for automatic hourly backups. I was using a REST server for Google Drive, but when version 0.9 came out I stopped using the REST server and switched to the rclone backend instead, which is great because I no longer have to keep rclone running all the time for the automatic backups to work. But sometimes I get this error while pruning old backups:

counting files in repo
rclone: 2018/05/24 19:23:20 ERROR : locks/94ca3f76f7c95e11b073d533d9375ab66ce42b245f33a21c4d3cf1560a77e762: Post request put error: googleapi: Error 403: User Rate Limit Exceeded, userRateLimitExceeded
Save(<lock/94ca3f76f7>) returned error, retrying after 589.675813ms: server response unexpected: 500 Internal Server Error (500)
[6:16] 100.00%  1032 / 1032 packs

It seems to work, though; the script finishes and reports all done. I’ve mounted the backup using fuse and it’s all in there. I’m using Ubuntu 18.04 and Restic 0.9, and my cloud is Google Drive. I know Google Drive has a limit of 750 GB or something like that (I read about it somewhere), but my backup is only about 4 GB; it’s not much, I’m just backing up documents to Google Drive. I think this has something to do with rclone and not Restic, but I just wanted to ask whether there’s something wrong with this error. Like I said, it seems to be working, and so far I haven’t had any problems; it’s just that error. Also, I used to see this error while running check, but I haven’t seen it there in a while; now it only appears when pruning. Thanks in advance for your help.

Sounds like Google is rate limiting you because you’re making too many API requests in a short amount of time. You could try tweaking rclone by adjusting --transfers and --checkers.
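Since restic launches rclone itself when you use the rclone backend, I believe you can pass those flags through restic’s rclone.args extended option. A rough sketch, untested against Google Drive (gdrive:backup is just a placeholder for your remote, and the serve restic --stdio part must stay, since rclone.args replaces restic’s default arguments):

    restic -o rclone.args="serve restic --stdio --transfers 2 --checkers 4" \
        -r rclone:gdrive:backup prune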


Thank you for your response. The weird thing is that this “error” only appears sometimes. Most of the time the backup succeeds just fine without errors. Say it has already done five backups: the message appears on the sixth, and by the eighth it’s gone again. The changed files are always only a few MiB, so I don’t really get it. At the end of every backup it seems to have done everything fine, and I’ve already checked by mounting the backups and running check, and it doesn’t give me any errors. Can you point me in the right direction in terms of adjusting --transfers and --checkers? I’m sorry to ask, but to be honest I have no idea how to do that. I’ve searched, but due to my lack of understanding of the subject, I don’t really know what I’m looking for. Thanks again.

Unfortunately I don’t have any experience with Google Drive, hence I won’t be of great help on that matter.

I’ve used rclone in the past when syncing some files to an SFTP server and ran into some rate limiting there. After setting --transfers=2 and --checkers=4 (half of the defaults), the issue went away.

Thank you for your response. So, I’ve waited a while and I still have the same problem. Sometimes it takes a long time to finish one job. I have a script and a cron job that take care of the process so I don’t have to do it all manually, and I’ve configured it to send me an email with the output every time it does a backup. The script runs the backup with some exclude flags, then check, and then prunes older backups using --keep-monthly, --keep-weekly, etc. It works great, though; it’s only the Limit Exceeded problem. Now, I’ve read in the documentation that you can configure rclone with environment variables like export RCLONE_BWLIMIT=1M. I’m trying to understand… sorry if the question is dumb, but does that mean you can configure any option that way? If I want to set --transfers=2, could I just do export RCLONE_TRANSFERS='2' in the same script?

Thanks again. Any help is really appreciated.

I haven’t tried this myself, so I can’t tell for sure, but from my understanding of the documentation this looks correct.
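If I’m reading the rclone docs right, any global flag can be set through an environment variable named RCLONE_ followed by the flag name in uppercase, and since restic starts rclone as a child process, it should inherit them. A minimal sketch for the top of your script (the values and the gdrive:backup remote are placeholders):

    export RCLONE_TRANSFERS=2
    export RCLONE_CHECKERS=4
    export RCLONE_BWLIMIT=1M
    restic -r rclone:gdrive:backup backup ~/Documents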

Not a big problem. rclone is hitting the quota limit of the Google Drive API, and that’s what causes the 403 error. After the error, rclone automatically backs off and lowers the rate of API calls, which is why the backup still completes successfully.
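If the retries ever become a nuisance, you could also cap the request rate up front with rclone’s --tpslimit flag, which limits the number of transactions per second to the API. For example (the value here is just an illustration, not a recommendation):

    export RCLONE_TPSLIMIT=5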

More info in this thread: https://forum.rclone.org/t/sending-data-to-gdrive-403-rate-limit-exceeded/3469

@764287 Thank you for taking the time to answer. I tried it, but it does the same thing. I also got some new errors, but I fixed those using rebuild-index and then check --read-data, because apparently I was missing some files or something like that; the backup is all there, though. Thank you again.
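For reference, these were roughly the commands (with gdrive:backup standing in for my actual remote):

    restic -r rclone:gdrive:backup rebuild-index
    restic -r rclone:gdrive:backup check --read-data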

@Cadish Great! Thank you for your answer. I assumed from the beginning that it was okay, because I’ve verified the backups and the files are there, but I wasn’t really sure whether everything was working the way it’s supposed to. Thanks again!