I’ve got a list of about 100 possible .lock files, the presence of which should cause the directory to be skipped. A bit of initial testing indicates that wildcards don’t seem to work, but I’m hoping that maybe there is a syntax I have missed.
sed -e 's|[^/]*\.lock$||' filename0.txt > filename.txt
Result:
/home/user/dir1/
/home/user/dir2/
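To make the sed step concrete, here is a minimal, self-contained run using made-up input that mimics the `find` output (the paths are illustrative, not from the original post). The expression `s|[^/]*\.lock$||` strips the lock-file name from the end of each line, leaving only the containing directory:

```shell
# Made-up input resembling the find output (paths are illustrative)
printf '%s\n' /home/user/dir1/.lock /home/user/dir2/.lock > /tmp/filename0.txt

# Strip the lock-file name from each line, keeping the directory
sed -e 's|[^/]*\.lock$||' /tmp/filename0.txt > /tmp/filename.txt

cat /tmp/filename.txt
```

This prints the two directory paths shown above, one per line.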
You can then use `--exclude-file="/path/to/your/filename.txt"` when performing a backup. The restic help says you can use this flag multiple times, so you don't have to worry if you also have a "static" exclusion list elsewhere. If these .lock files change over time, you can re-run the find step every time you run the backup. It's not ideal, but it's easy to script, so you can stop worrying about having to do the two steps manually every time you need to run a backup.
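The "script it" idea could look something like the sketch below. This is my own illustration, not the poster's script; every path is a placeholder, and it assumes GNU find (whose `-printf '%h\n'` prints the directory containing each match, so sed isn't even needed). The restic command is echoed rather than executed so the sketch stays self-contained:

```shell
#!/bin/sh
# Sketch only -- every path here is a placeholder, adjust to your setup.
SRC=/tmp/lockdemo
EXCLUDES=/tmp/lock-excludes.txt

# Throwaway tree standing in for the real backup source
mkdir -p "$SRC/dir1" "$SRC/dir2"
touch "$SRC/dir2/app.lock"

# Step 1: rebuild the exclusion list before each run.
# GNU find's -printf '%h\n' emits the parent directory of each match.
find "$SRC" -name '*.lock' -printf '%h\n' > "$EXCLUDES"

# Step 2: run the backup; --exclude-file repeats to add a static list.
# Echoed instead of run so this example works without a repository.
echo restic backup "$SRC" \
    --exclude-file="$EXCLUDES" \
    --exclude-file=/path/to/static-excludes.txt
```

Dropped into cron or a systemd timer, this keeps the exclusion list fresh on every backup.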
EDIT:
I made this script just to see if it works for me, and it does. My test directory contains 6 sub-dirs, 3 of which have .lock files. This is the output of `restic ls` on the latest snapshot:
repository b9349ed1 opened successfully, password is correct
snapshot 87970e1c of [/home/dj0k3/testdir] filtered by [] at 2019-02-22 11:53:02.466103641 -0500 EST):
/home
/home/dj0k3
/home/dj0k3/testdir
/home/dj0k3/testdir/dir1
/home/dj0k3/testdir/dir3
/home/dj0k3/testdir/dir5
Sub-dirs 2, 4 and 6 contain those lock files, so they were excluded. This is the little script I used: