Repository Corrupted? Error: load <snapshot/xxxxxxx>: invalid data returned

Hi there!

One of my backup destinations started failing recently. Restic complains about a seemingly-corrupted snapshot:

load <snapshot/95392d6075>: invalid data returned

I’ve tried various commands to fix the repo, to no avail. Whatever the command, I always end up with the same “invalid data returned” error.

What could have caused this to suddenly appear? Is there any way to fix this without creating a new repo and re-uploading everything?

Some things that might help figure it out:

  • This is an SFTP destination
  • There was a “connection reset” SSH error during backup two days before this error started popping up. What’s weird is that the following backup worked without any issues. So: Day 1, SSH error. Day 2, no error. Day 3, repo seemingly corrupted.

But I suppose that restic is resilient to SSH connection resets and isn’t supposed to corrupt remote repositories in this specific case?

restic check

using temporary cache in C:\Users\Arsenic\AppData\Local\Temp\restic-check-cache-417591663
repository fb5dd416 opened successfully, password is correct                             
created new cache in C:\Users\Arsenic\AppData\Local\Temp\restic-check-cache-417591663    
create exclusive lock for repository                                                     
load indexes                                                                             
check all packs                                                                          
check snapshots, trees and blobs                                                         
error: load <snapshot/95392d6075>: invalid data returned                                 
Fatal: repository contains errors                                                        

restic forget 95392d6075

repository fb5dd416 opened successfully, password is correct                                                                                           
Ignoring "95392d6075ec824472c05842ad5b0611ee41a4e7a91f8b76a9d2db84cfa7ec0b", could not load snapshot: load <snapshot/95392d6075>: invalid data returned

restic prune

repository fb5dd416 opened successfully, password is correct                
counting files in repo                                                      
building new index for repo                                                 
[27:23] 100.00%  75943 / 75943 packs                                        
repository contains 75943 packs (517201 blobs) with 238.464 GiB             
processed 517201 blobs: 0 duplicate blobs, 0 B duplicate                    
load all snapshots                                                          
load <snapshot/95392d6075>: invalid data returned                           
github.com/restic/restic/internal/repository.(*Repository).LoadAndDecrypt   
        /restic/internal/repository/repository.go:96                        
github.com/restic/restic/internal/repository.(*Repository).LoadJSONUnpacked 
        /restic/internal/repository/repository.go:204                       
github.com/restic/restic/internal/restic.LoadSnapshot                       
        /restic/internal/restic/snapshot.go:61                              
github.com/restic/restic/internal/restic.LoadAllSnapshots.func1             
        /restic/internal/restic/snapshot.go:72                              
github.com/restic/restic/internal/repository.(*Repository).List.func1       
        /restic/internal/repository/repository.go:644                       
github.com/restic/restic/internal/backend.(*RetryBackend).List.func1.1      
        /restic/internal/backend/backend_retry.go:133                       
github.com/restic/restic/internal/backend/local.(*Local).List.func1         
        /restic/internal/backend/local/local.go:248                         
path/filepath.walk                                                          
        /usr/local/go/src/path/filepath/path.go:358                         
path/filepath.walk                                                          
        /usr/local/go/src/path/filepath/path.go:382                         
path/filepath.Walk                                                          
        /usr/local/go/src/path/filepath/path.go:404                         
github.com/restic/restic/internal/fs.Walk                                   
        /restic/internal/fs/file.go:99                                      
github.com/restic/restic/internal/backend/local.(*Local).List               
        /restic/internal/backend/local/local.go:219                         
github.com/restic/restic/internal/backend.(*RetryBackend).List.func1        
        /restic/internal/backend/backend_retry.go:127                       
github.com/cenkalti/backoff.RetryNotify                                     
        /restic/vendor/github.com/cenkalti/backoff/retry.go:37              
github.com/restic/restic/internal/backend.(*RetryBackend).retry             
        /restic/internal/backend/backend_retry.go:36                        
github.com/restic/restic/internal/backend.(*RetryBackend).List              
        /restic/internal/backend/backend_retry.go:126                       
github.com/restic/restic/internal/repository.(*Repository).List             
        /restic/internal/repository/repository.go:638                       
github.com/restic/restic/internal/restic.LoadAllSnapshots                   
        /restic/internal/restic/snapshot.go:71                              
main.pruneRepository                                                        
        /restic/cmd/restic/cmd_prune.go:174                                 
main.runPrune                                                               
        /restic/cmd/restic/cmd_prune.go:85                                  
main.glob..func17                                                           
        /restic/cmd/restic/cmd_prune.go:25                                  
github.com/spf13/cobra.(*Command).execute                                   
        /restic/vendor/github.com/spf13/cobra/command.go:762                
github.com/spf13/cobra.(*Command).ExecuteC                                  
        /restic/vendor/github.com/spf13/cobra/command.go:852                
github.com/spf13/cobra.(*Command).Execute                                   
        /restic/vendor/github.com/spf13/cobra/command.go:800                
main.main                                                                   
        /restic/cmd/restic/main.go:86                                       
runtime.main                                                                
        /usr/local/go/src/runtime/proc.go:203                               
runtime.goexit                                                              
        /usr/local/go/src/runtime/asm_amd64.s:1357

There should be a file with that name in the snapshots directory. Could you please download it and check that its sha256sum is correct? It should match the file name.
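The check can be sketched like this. It’s only an illustration run in a temporary directory (the “healthy” file and its contents are made up): in a real repository, every file under snapshots/ is named after the SHA-256 of its contents, so a hash/name mismatch, or a 0-byte file, means the snapshot file is corrupted.

```shell
# Demonstration in a temp directory: restic names each file in snapshots/
# after the SHA-256 of its contents, so recomputing the hash and comparing
# it to the file name detects corruption.
tmp=$(mktemp -d)
mkdir -p "$tmp/snapshots"

# A healthy snapshot file: its name equals the sha256sum of its contents.
printf 'fake snapshot data' > "$tmp/blob"
good=$(sha256sum "$tmp/blob" | cut -d' ' -f1)
mv "$tmp/blob" "$tmp/snapshots/$good"

# The corrupted case from this thread: a truncated, 0-byte file.
bad=95392d6075ec824472c05842ad5b0611ee41a4e7a91f8b76a9d2db84cfa7ec0b
: > "$tmp/snapshots/$bad"

# The actual check: recompute each file's hash and compare to its name.
result=$(for f in "$tmp/snapshots"/*; do
    name=$(basename "$f")
    actual=$(sha256sum "$f" | cut -d' ' -f1)
    [ "$actual" = "$name" ] && echo "OK       $name" || echo "MISMATCH $name"
done)
echo "$result"
rm -rf "$tmp"
```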

If not, then the snapshot itself is broken and you probably can’t do anything to recover it completely.

As far as I understand, you’d prefer to save some traffic and avoid uploading the data again. What I would suggest is to remove that file manually and run restic check. But DO NOT RUN PRUNE. Even if you only have one snapshot, most of the data is already uploaded. If restic check passes after removing the snapshot file (it may only report something like pack xxx: not referenced in any index), run rebuild-index and then back up again. restic will detect that certain blobs are already uploaded (even if they are no longer part of any snapshot) and reuse them.
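Those steps can be sketched as a command sequence. This is just an illustration, not a tested recipe: the repository and data paths are placeholders, and RESTIC defaults to `echo restic` here so running the sketch only prints the commands instead of executing them.

```shell
# Sketch of the recovery steps above. RESTIC defaults to `echo restic`
# so this prints the commands rather than running them; set RESTIC=restic
# (with RESTIC_REPOSITORY and RESTIC_PASSWORD configured) to execute
# them for real.
RESTIC=${RESTIC:-echo restic}

# 1. Remove the corrupted snapshot file by hand (placeholder path):
#    rm /path/to/repo/snapshots/95392d6075ec824472c05842ad5b0611ee41a4e7a91f8b76a9d2db84cfa7ec0b

# 2. Verify the rest of the repository is intact. Do NOT prune yet.
$RESTIC check

# 3. Rebuild the index so already-uploaded packs are referenced again.
$RESTIC rebuild-index

# 4. Back up again: restic deduplicates against existing blobs, so only
#    data that is actually missing gets re-uploaded.
$RESTIC backup /path/to/data
```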

PS. I’m not a restic developer, so this is just my opinion as a restic user.

Interesting!

The file had a size of 0 KB and the hash did not match the filename. I deleted it as you suggested and restic check now passes. I’m in the process of rebuilding the index and will let the usual backup run tonight. We’ll see how it goes.

Do you know if deleting a snapshot file is equivalent to “forgetting” it in restic? Are there subtleties involved?

As far as I know, it’s exactly the same thing. The only difference is that before removing the file, restic forget loads it (to check it against the various --keep-*/--host/--tag policies).


Just to conclude this thread, I can confirm that @dionorgua’s proposed fix worked (thanks!). However, how my repo got in this state remains a mystery.