Restic failing with "panic: runtime error: invalid memory address or nil pointer dereference"

After using restic successfully for a while, it is now failing to back up on one of my machines, and I have no idea what to do. The repo URL (sftp) and password are set in the environment. Other machines are still backing up successfully to the same repo, and there is no obvious system problem (nothing in syslog; this is Ubuntu 18.04 with plenty of disk and RAM). Any help appreciated . . . here is the output:

```
restic backup --tag systemd.timer --exclude ~/.restic-excludes --exclude=/home/weinberg/Nbody /home/weinberg
repository f913a463 opened successfully, password is correct
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x7b82c8]

goroutine 42 [running]:
github.com/restic/restic/internal/repository.(*Index).ID(0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/restic/internal/repository/index.go:377 +0x38
github.com/restic/restic/internal/repository.(*Repository).LoadIndex.func5(0x8, 0xe6c288)
	/restic/internal/repository/repository.go:467 +0xce
golang.org/x/sync/errgroup.(*Group).Go.func1(0xc0002eb020, 0xc00030b260)
	/restic/vendor/ +0x57
created by golang.org/x/sync/errgroup.(*Group).Go
	/restic/vendor/ +0x66
```
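For what it's worth, the trace shows `(*Index).ID` being invoked with a nil receiver (the `0x0` first argument), which is exactly what produces this panic message in Go. Here is a minimal, self-contained illustration; the `Index` type below is a hypothetical stand-in, not restic's real one:

```go
package main

import "fmt"

// Index is a hypothetical stand-in for restic's internal index type.
type Index struct{ id string }

// ID reads a field through the receiver, so calling it on a nil *Index
// dereferences a nil pointer and panics at runtime.
func (idx *Index) ID() string { return idx.id }

// callOnNil invokes ID on a nil *Index and returns the recovered panic
// message, mirroring the 0x0 receiver in the stack trace above.
func callOnNil() (msg string) {
	defer func() {
		if r := recover(); r != nil {
			msg = fmt.Sprint(r)
		}
	}()
	var idx *Index // nil
	_ = idx.ID()
	return "no panic"
}

func main() {
	fmt.Println(callOnNil())
}
```

Running this prints `runtime error: invalid memory address or nil pointer dereference`, matching the panic in the trace; the underlying question is why restic's index loading ended up with a nil index in the first place.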

The version is:

```
restic version
restic 0.9.5 compiled with go1.12.4 on linux/amd64
```

I seem to have repaired the situation with a “restic rebuild-index”; restic reported that it found 4 old files. A “restic check” then showed no errors, and the “restic backup” worked after that. I still don’t understand what happened . . . or why it only affected one host.
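For anyone hitting the same panic, the repair sequence I used boils down to the following (a sketch of my setup, assuming `RESTIC_REPOSITORY` and `RESTIC_PASSWORD` are already exported in the environment; the exclude paths are specific to my machine):

```shell
# Rebuild the index from the pack files; restic reports any old
# index files it replaces.
restic rebuild-index

# Verify repository consistency afterwards.
restic check

# Retry the backup that previously panicked.
restic backup --tag systemd.timer --exclude ~/.restic-excludes /home/weinberg
```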

@mdw, you clearly found a bug caused (or at least triggered) by something wrong in your indexes.
My advice would be for you to try to reproduce it (perhaps by restoring the repo to its previous state, if possible) and, if it’s indeed reproducible, open an issue on restic’s GitHub.

Thanks for the reply, @durval.

The error was strictly reproducible when the index was in its previous state. It’s a 300GB repo. While I have a copy of the repo in the failed state, I’m not sure that I can easily provide the developers with a failing instance.

@mdw, you are welcome. And who said it was going to be easy? :wink:

Seriously now, you don’t need to provide the developers with a failing instance. If you yourself can access it now, it would suffice for you to preserve it in that state, then open the issue and follow the developers’ instructions – they will most probably be able to pass you binaries and commands to run to try and track down the problem.

Tracking down and fixing it would help make restic more robust and an even better piece of software than it already is, which would benefit everyone.

Sadly, my rsynced copy does not seem to fail. It was made at a slightly earlier point in time than the failure (the rsync runs automatically in the middle of the night).

If this happens again, I’ll rsync before doing any rebuilds.

Thanks for having a look at it, @mdw. A better restic makes the sky bluer, the sun shinier and the grass greener :wink:

I just wanted to add some more context in the hope it might be helpful. I got a similar error, and when I tried to rebuild-index, the repo was locked by a 3-month-old lock. I unlocked it, and now I’m running the index rebuild. Here’s the full error:
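For reference, clearing the stale lock before the rebuild looked roughly like this (a sketch; only run `unlock` when you are sure no other restic process is still using the repository):

```shell
# Show current locks on the repository.
restic list locks

# Remove locks that no longer belong to a running process.
restic unlock

# Then rebuild the index.
restic rebuild-index
```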

```
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xc0000005 code=0x0 addr=0x0 pc=0x7bbdef]

goroutine 53 [running]:
github.com/restic/restic/internal/repository.(*Index).ID(0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/restic/internal/repository/index.go:377 +0x3f
github.com/restic/restic/internal/repository.(*Repository).LoadIndex.func5(0xc000000008, 0xe415d8)
	/restic/internal/repository/repository.go:467 +0xd5
golang.org/x/sync/errgroup.(*Group).Go.func1(0xc0002fce40, 0xc00032c8c0)
	/restic/vendor/ +0x5e
created by golang.org/x/sync/errgroup.(*Group).Go
	/restic/vendor/ +0x6d
```

This is very interesting, considering it’s been seen before. Did the index rebuild have any effect on the issue?


I rebuilt the index, then ran the backup command that caused the original error; the problem was gone and the backup completed successfully.

I’m still seeing this error on 0.9.5. I’ll try upgrading to 0.11.0 to see if it still has the issue.

@onionjake The crash has been fixed in restic 0.9.6; see the changelog for that release.

Thanks, sorry for resurrecting an old thread. At least the PR that fixes it will be here in the future for anyone looking!