Restic ls - folders only

Here’s an idea. I have snapshots that cause Restic to swell up to 32GB of RAM when I try to mount even one of them on its own (and I only have 16GB of RAM). I wanted an easy way to see what’s in a snapshot, but even dumping “restic ls” to a text file is unwieldy. So I’d like to request a “folders only” option: being able to just run “restic ls --folders-only” would make the output much more manageable.
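
In the meantime, a possible workaround (assuming a restic build with JSON output and jq on hand): “restic ls --json” prints one JSON object per entry, and directories carry "type": "dir", so a rough sketch of a folders-only listing could be:

```sh
# Rough approximation of a "folders only" listing. The repo path and
# snapshot ID are placeholders; directories have "type": "dir" in
# restic's JSON output, so jq can filter out everything else.
restic -r /path/to/repo ls --json 8f07e1e5 \
  | jq -r 'select(.type == "dir") | .path'
```

It still walks the whole snapshot, but “ls” is typically far lighter on memory than “mount”.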

If it passes muster here, then I’ll request it on GitHub too. :slight_smile:

EDIT:

I just tried closing out everything I could, and this time it got up to 50GB of RAM before zsh had to kill it. I still wasn’t even able to run a “tree -d” on the mounted snapshot. :frowning:

That sounds insane. What are the stats of that repo?

I’ll run stats and see. It’ll take a bit.

It also seems to be failing to restore now. It gets so far, and then I get:

rclone: 2023/02/21 11:53:57 ERROR : locks/47f9c8dbbd70935756ea0ff7fde186f047453ad3e3ebb66cbd67be3b827580b8: Post request put error: Put "https://ohsuitg.sharepoint.com/sites/SM.BaScIT/_api/v2.0/drives/b!puI10iZksUq-hdUMffZzNj0Svxeqg9NBmTgnKVeL894L54CQuqhdSpf9Fql6te6P/items/015I3SJS45INOAOXPYQJAKM33KIWWCZUUH/uploadSession?guid='bb9ef05e-482b-47f4-bb9f-f769a5ccdbec'&overwrite=True&rename=False&dc=0&tempauth=eyJ0eXAiOiJKV1QiLCJhbGciOiJub25lIn0.eyJhdWQiOiIwMDAwMDAwMy0wMDAwLTBmZjEtY2UwMC0wMDAwMDAwMDAwMDAvb2hzdWl0Zy5zaGFyZXBvaW50LmNvbUBlMjczNzk1Ny1mYWI4LTRkN2UtOTRmNi05YmQ2YWY5ZjcxNTgiLCJpc3MiOiIwMDAwMDAwMy0wMDAwLTBmZjEtY2UwMC0wMDAwMDAwMDAwMDAiLCJuYmYiOiIxNjc3MDA4NzIyIiwiZXhwIjoiMTY3NzA5NTEyMiIsImVuZHBvaW50dXJsIjoiZnY1amNiRlMydWgvaDR5b3lrbHZWUFExd2pMMFd1OG9lMmtaY1hmQXJEWT0iLCJlbmRwb2ludHVybExlbmd0aCI6IjI2MiIsImlzbG9vcGJhY2siOiJUcnVlIiwiY2lkIjoiTWpKbVptTTFOalV0WlRJME5DMDBOemhqTFdFd1pEa3RNakF4T1RsaVpqa3pNV1kxIiwidmVyIjoiaGFzaGVkcHJvb2Z0b2tlbiIsInNpdGVpZCI6IlpESXpOV1V5WVRZdE5qUXlOaTAwWVdJeExXSmxPRFV0WkRVd1l6ZGtaalkzTXpNMiIsImFwcF9kaXNwbGF5bmFtZSI6InJjbG9uZSIsImdpdmVuX25hbWUiOiJNYXJrIiwiZmFtaWx5X25hbWUiOiJTbWl0aCBKci4iLCJhcHBpZCI6ImIxNTY2NWQ5LWVkYTYtNDA5Mi04NTM5LTBlZWMzNzZhZmQ1OSIsInRpZCI6ImUyNzM3OTU3LWZhYjgtNGQ3ZS05NGY2LTliZDZhZjlmNzE1OCIsInVwbiI6InNtaXRtYXJrQG9oc3UuZWR1IiwicHVpZCI6IjEwMDMyMDAwN0Q4ODE3N0UiLCJjYWNoZWtleSI6IjBoLmZ8bWVtYmVyc2hpcHwxMDAzMjAwMDdkODgxNzdlQGxpdmUuY29tIiwic2NwIjoibXlmaWxlcy5yZWFkIGFsbGZpbGVzLnJlYWQgbXlmaWxlcy53cml0ZSBhbGxmaWxlcy53cml0ZSBhbGxzaXRlcy5yZWFkIiwidHQiOiIyIiwidXNlUGVyc2lzdGVudENvb2tpZSI6bnVsbCwiaXBhZGRyIjoiMjAuMTkwLjE1NC4xNjAifQ.MmJ5ZGlkZnRKUU43elZpUTgyeDNxOElKeGJ0Z1d0dXg1VDRZYUM0ck9wbz0": stream error: stream ID 93; CANCEL; received from peer

rclone: 2023/02/21 11:53:57 ERROR : locks/47f9c8dbbd70935756ea0ff7fde186f047453ad3e3ebb66cbd67be3b827580b8: Post request rcat error: Put "https://ohsuitg.sharepoint.com/sites/SM.BaScIT/_api/v2.0/drives/b!puI10iZksUq-hdUMffZzNj0Svxeqg9NBmTgnKVeL894L54CQuqhdSpf9Fql6te6P/items/015I3SJS45INOAOXPYQJAKM33KIWWCZUUH/uploadSession?guid='bb9ef05e-482b-47f4-bb9f-f769a5ccdbec'&overwrite=True&rename=False&dc=0&tempauth=eyJ0eXAiOiJKV1QiLCJhbGciOiJub25lIn0.eyJhdWQiOiIwMDAwMDAwMy0wMDAwLTBmZjEtY2UwMC0wMDAwMDAwMDAwMDAvb2hzdWl0Zy5zaGFyZXBvaW50LmNvbUBlMjczNzk1Ny1mYWI4LTRkN2UtOTRmNi05YmQ2YWY5ZjcxNTgiLCJpc3MiOiIwMDAwMDAwMy0wMDAwLTBmZjEtY2UwMC0wMDAwMDAwMDAwMDAiLCJuYmYiOiIxNjc3MDA4NzIyIiwiZXhwIjoiMTY3NzA5NTEyMiIsImVuZHBvaW50dXJsIjoiZnY1amNiRlMydWgvaDR5b3lrbHZWUFExd2pMMFd1OG9lMmtaY1hmQXJEWT0iLCJlbmRwb2ludHVybExlbmd0aCI6IjI2MiIsImlzbG9vcGJhY2siOiJUcnVlIiwiY2lkIjoiTWpKbVptTTFOalV0WlRJME5DMDBOemhqTFdFd1pEa3RNakF4T1RsaVpqa3pNV1kxIiwidmVyIjoiaGFzaGVkcHJvb2Z0b2tlbiIsInNpdGVpZCI6IlpESXpOV1V5WVRZdE5qUXlOaTAwWVdJeExXSmxPRFV0WkRVd1l6ZGtaalkzTXpNMiIsImFwcF9kaXNwbGF5bmFtZSI6InJjbG9uZSIsImdpdmVuX25hbWUiOiJNYXJrIiwiZmFtaWx5X25hbWUiOiJTbWl0aCBKci4iLCJhcHBpZCI6ImIxNTY2NWQ5LWVkYTYtNDA5Mi04NTM5LTBlZWMzNzZhZmQ1OSIsInRpZCI6ImUyNzM3OTU3LWZhYjgtNGQ3ZS05NGY2LTliZDZhZjlmNzE1OCIsInVwbiI6InNtaXRtYXJrQG9oc3UuZWR1IiwicHVpZCI6IjEwMDMyMDAwN0Q4ODE3N0UiLCJjYWNoZWtleSI6IjBoLmZ8bWVtYmVyc2hpcHwxMDAzMjAwMDdkODgxNzdlQGxpdmUuY29tIiwic2NwIjoibXlmaWxlcy5yZWFkIGFsbGZpbGVzLnJlYWQgbXlmaWxlcy53cml0ZSBhbGxmaWxlcy53cml0ZSBhbGxzaXRlcy5yZWFkIiwidHQiOiIyIiwidXNlUGVyc2lzdGVudENvb2tpZSI6bnVsbCwiaXBhZGRyIjoiMjAuMTkwLjE1NC4xNjAifQ.MmJ5ZGlkZnRKUU43elZpUTgyeDNxOElKeGJ0Z1d0dXg1VDRZYUM0ck9wbz0": stream error: stream ID 93; CANCEL; received from peer

unable to refresh lock: server response unexpected: 500 Internal Server Error (500)

Fatal: failed to refresh lock in time

rclone: 2023/02/21 11:54:34 ERROR : data/41/4139a99364a80dba2e0d7bcbc58ef6b75121ed4bf26733af3d81273edae2e7f6: Didn't finish writing GET request (wrote 8921885/135045103 bytes): context canceled

rclone: 2023/02/21 11:54:34 ERROR : data/71/71d7e6f69c1fbaff4a5732e82e560649ccbe9e37592b3131ba26c28720893474: Didn't finish writing GET request (wrote 107999285/134929217 bytes): context canceled

Would running it with GOGC=20 possibly alleviate the memory issues? I think the error above is likely unrelated and just SharePoint timing out, but still…
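
For context, GOGC is just an environment variable for the Go runtime (default 100; lower values make garbage collection more aggressive), and if your restic binary was built with Go 1.19 or later you could also try the soft memory limit. A sketch, with a placeholder repo path, mount point, and limit:

```sh
# Collect garbage more aggressively (GOGC=20) and cap the heap with
# Go 1.19's soft memory limit. Values and paths are illustrative.
GOGC=20 GOMEMLIMIT=12GiB restic -r /path/to/repo mount /mnt/restic
```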

At the moment I’m doing restic copy || restic copy || restic copy || etc. in an attempt to restore, since “restore” won’t actually work. It will copy about 50 out of 1762 packs (it’s about a 300GB snapshot) and then fail. Luckily “copy” can actually resume, so if I run it enough times, in theory I should get it locally. This will work for the 300GB snapshot, but I’m going to have a time of it with the ~6TB snapshot. :frowning:
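
Rather than chaining || by hand, a retry loop might be less tedious. A rough sketch, with placeholder repo locations and password flags omitted (restic 0.14+ spells the source repository as --from-repo):

```sh
# Keep retrying until the copy completes. "copy" skips snapshots and
# blobs already present in the destination, so each run resumes
# roughly where the last one died.
until restic -r /path/to/local-repo copy \
      --from-repo rclone:sharepoint:restic 8f07e1e5; do
  echo "copy failed, retrying in 60s..." >&2
  sleep 60
done
```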

Restic stats is still churning, but here are SharePoint’s stats:

[screenshot of SharePoint storage stats]

I use a 128M chunk size, which I was hoping would cut down on SharePoint issues but… I’d say I need a larger chunk size than that, even…
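
If that refers to restic’s pack size, note that 128 MiB is already the largest value restic accepts, so there’s no headroom above it. For reference, a hypothetical invocation:

```sh
# restic allows --pack-size between 4 and 128 MiB; 128 is the ceiling.
# Larger packs mean fewer objects on the SharePoint side.
restic -r /path/to/repo backup --pack-size 128 /some/data
```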

Stats in restore-size mode:
Snapshots processed: 234
Total File Count: 173676679
Total Size: 90.947 TiB

Stats in raw-data mode:
Snapshots processed: 234
Total Blob Count: 70152613
Total Uncompressed Size: 16.866 TiB
Total Size: 8.805 TiB
Compression Progress: 100.00%
Compression Ratio: 1.92x
Compression Space Saving: 47.79%

Well this is interesting. I have a LOCAL copy of the 300GB snapshot, and I’m getting this error now:

repository f186139f opened (version 2, compression level max)
restoring <Snapshot 8f07e1e5 of [/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse /Volumes/Padlock_DT/OHSUMacBackUp_04292022afterUse] at 2022-06-16 11:39:14.830481 -0700 PDT by smitmark@RJHB595> to /Volumes/042122_DL_Backup2
Fatal: failed to refresh lock in time

So it looks like, local or remote, it’s having trouble saving the lock file either way.

For the local repo:

Stats in restore-size mode:
Snapshots processed: 6
Total File Count: 381596
Total Size: 796.597 GiB

Stats in raw-data mode:
Snapshots processed: 6
Total Blob Count: 669543
Total Uncompressed Size: 599.856 GiB
Total Size: 360.736 GiB
Compression Progress: 100.00%
Compression Ratio: 1.66x
Compression Space Saving: 39.86%

It’s a 1TB NVMe chip. Pretty swift. I’m running restic 0.15.1 compiled with go1.19.5 on darwin/arm64.

After a while, it continued, and I got a ton of errors like this:

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina122_NovogeneNGS_150PE_M53-138/M55-84_Illumina122_RawDataFiles/M54-89_S0_L001_R2_001.fastq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina122_NovogeneNGS_150PE_M53-138/M55-84_Illumina122_RawDataFiles/M54-89_S0_L001_R2_001.fastq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/M26-124_Illumina80_NovogeneNGS_PE150read60M/raw_data/M26_125_2.fq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/M26-124_Illumina80_NovogeneNGS_PE150read60M/raw_data/M26_125_2.fq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina122_NovogeneNGS_150PE_M53-138/M55-84_Illumina122_RawDataFiles/M54-89_S0_L001_R1_001.fastq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina122_NovogeneNGS_150PE_M53-138/M55-84_Illumina122_RawDataFiles/M54-89_S0_L001_R1_001.fastq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq (1).gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq (1).gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq (1).gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina91_OHSUMPSSR_NextSeq75single_Kazu/LIB190731HN/Undetermined_S0_R1_001.fastq (1).gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina82_NovogeneNGS_PE150read60M/RawData/M27_59_2.fq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina82_NovogeneNGS_PE150read60M/RawData/M27_59_2.fq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina82_NovogeneNGS_PE150read60M/RawData/M27_59_2.fq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina82_NovogeneNGS_PE150read60M/RawData/M27_59_2.fq.gz: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina94_OHSUMPSSR_NextSeq75single/M36-17_Illumina94Sam_PeptideExt_go133_2OUTPUT.txt: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina94_OHSUMPSSR_NextSeq75single/M36-17_Illumina94Sam_PeptideExt_go133_2OUTPUT.txt: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina94_Sam_PeptideExt/M36-17_Illumina94Sam_PeptideExt_go133_2OUTPUT.txt: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina94_Sam_PeptideExt/M36-17_Illumina94Sam_PeptideExt_go133_2OUTPUT.txt: no such file or directory

ignoring error for /Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina84_NovogeneNGS_PE150read250M_Sunghee/M27_93_1.fq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina84_NovogeneNGS_PE150read250M_Sunghee/M27_93_1.fq.gz: no such file or directory

Wait, I’m confused by these error messages…

“/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina84_NovogeneNGS_PE150read250M_Sunghee/M27_93_1.fq.gz: open /Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina84_NovogeneNGS_PE150read250M_Sunghee/M27_93_1.fq.gz: no such file or directory”

The command I ran was restic restore 8f07e1e5 --verify -t /Volumes/042122_DL_Backup2

The repo is located at “/Volumes/Fortress_L3/db”.

The original paths of the snapshot are:
/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse
/Volumes/Padlock_DT/OHSUMacBackUp_04292022afterUse

Does this look right? Should the error be mentioning “/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina84_NovogeneNGS_PE150read250M_Sunghee/M27_93_1.fq.gz” as if it’s trying to open it from that path? Or is that just the file it’s having trouble with?
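
If I understand restore correctly, it prepends the target directory to each file’s original absolute path, so the second path in each message is where the file was written (and where --verify then tries to reopen it). One way to check, using a path from the errors above:

```sh
# Does the restored file actually exist at the combined target path?
ls -l "/Volumes/042122_DL_Backup2/Volumes/Padlock_DT/BoxSyncBackUp_04292022afterUse/Experiments/IlluminaSequencing/Illumina84_NovogeneNGS_PE150read250M_Sunghee/M27_93_1.fq.gz"
```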

Maybe the path is too long? Wasn’t there some limit like 260 characters on Windows or something?!

Hmm maybe. Though it was an HFS+ volume backed up on a Mac.
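
If path length is the culprit, one quick check is to find the longest path under the restore target and compare it against the usual macOS limits (255 bytes per path component, 1024-byte PATH_MAX for most syscalls). A sketch, assuming the target from above:

```sh
# Print the length and value of the longest path under the target.
find /Volumes/042122_DL_Backup2 -print \
  | awk 'length > max { max = length; longest = $0 } END { print max, longest }'
```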

I’m still trying to get this thing to restore. I did a big prune job to lighten the load on the SharePoint server and on my RAM. I’ll post here if I ever figure anything out.

Update: I had to copy this snapshot to an empty local repository. Once I did that, I could both restore it and mount it, though it did take several days to restore, and that was from a 4TB NVMe chip in a Thunderbolt 3 enclosure to a brand-new external WD HDD. I’m guessing the repo is just too big for my current hardware (8TB repo, 16GB RAM).
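
For anyone following along, the copy-to-local-repo step was roughly the following (paths are placeholders, password flags omitted); initializing with --copy-chunker-params keeps deduplication compatible between the two repositories:

```sh
# Create a local repo with the source repo's chunker parameters,
# then copy the one snapshot into it.
restic -r /path/to/local-repo init \
       --from-repo rclone:sharepoint:restic --copy-chunker-params
restic -r /path/to/local-repo copy \
       --from-repo rclone:sharepoint:restic 8f07e1e5
```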

That said, I still think the original feature request would be useful. Much like “tree” has a “-d” flag to output only directories, restic ls would benefit from the same option.

We don’t track feature requests in the forum; please open an issue on GitHub.