Server initiating the backup with clients that have no restic configuration pre-installed

I’ve had this idea in my mind for the last few years, but I lacked the time to work on it.

Now, after a few years of on-and-off experimentation, I think I’ve got something interesting.

At the start, I wanted a machine whose backups cannot be traced back from it once it’s been compromised. I know we have append-only mode with rest-server, but you can still browse and view files that you probably shouldn’t.

So here’s my idea, with a fully working PoC. I’ll call the server that kicks off the backups the manager:

  • The manager needs access to each client via SSH (for now, but it can also be something else)
  • The manager connects automatically to a client via SSH
  • The manager forwards a port on the client, through the SSH tunnel, back to an internal web server running on the manager
  • The manager then runs a binary on the client (it could already be there, or just downloaded)
  • The launched binary will download the files from the manager web server (it could also download via unix pipes inside SSH I guess, but that’s a start)
  • That’s where it gets interesting: the client will put these files into a new FUSE virtual filesystem in memory. NO configuration and/or password file will be saved to disk on the client.
  • Now we can launch restic using the files on the FUSE mount
  • At this point we can also provide a port redirection inside SSH to access the backup server. It doesn’t have to, though.
  • Once finished, we unmount the virtual disk and wipe the files from memory (scrambling is not yet in the PoC)
  • We close the SSH connection => no trace of the backup (no command line written in the shell history file)
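Concretely, the manager-side SSH flow boils down to the commands you can see in the PoC log further down. Here is the same sequence as a sketch; the host alias vps01, the socket path and the port numbers are just examples:

```shell
# Sketch of the manager-side SSH orchestration (taken from the PoC log;
# host alias, socket path and ports are examples, not fixed values).

SOCK=/tmp/rp-ssh/ssh.sock

# 1. Open a backgrounded multiplexing master connection (no remote shell):
ssh -f -M -N -S "$SOCK" vps01

# 2. Through the master, open a reverse tunnel: port 0 lets the remote
#    sshd pick a free port, forwarded back to the manager's file server:
ssh -S "$SOCK" -O forward -R 0:localhost:50733 vps01

# 3. Run the client binary remotely, pointing it at the tunnelled port:
ssh -t -t -S "$SOCK" vps01 \
  /home/user/resticprofile/resticprofile -v -r http://localhost:35855/configuration/openssh

# 4. Once the backup is finished, tear down the master connection:
ssh -S "$SOCK" -O exit vps01
```

This is a command fragment that needs a reachable SSH host, so treat it as illustration rather than a copy-paste script.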

Here’s an example of running the init command remotely (against the local disk, I know!). The manager is on macOS, whereas the machine that runs the init is on Linux:

% resticprofile -v -c examples/private/remotes.yaml send openssh init
2026/03/30 22:36:31 resticprofile 0.33.0-dev compiled with go1.26.1 darwin/arm64
2026/03/30 22:36:31 loading: examples/private/remotes.yaml
2026/03/30 22:36:31 files in configuration are relative to "examples/private"
2026/03/30 22:36:31 memory available: 7728MB
2026/03/30 22:36:31 using restic 0.18.1
2026/03/30 22:36:31 send: this command is experimental and its behaviour may change in the future
2026/03/30 22:36:31 running command: /usr/bin/ssh -f -M -N -S /var/folders/yn/xqk1tr8527q75jh_y134tpjc0000gn/T/rp-ssh1850895886/ssh.sock vps01
2026/03/30 22:36:31 file server listening locally on 127.0.0.1:50733
2026/03/30 22:36:31 running command: /usr/bin/ssh -S /var/folders/yn/xqk1tr8527q75jh_y134tpjc0000gn/T/rp-ssh1850895886/ssh.sock -O forward -R 0:localhost:50733 vps01
2026/03/30 22:36:31 port 35855 opened in tunnel
2026/03/30 22:36:31 running command: /usr/bin/ssh -t -t -S /var/folders/yn/xqk1tr8527q75jh_y134tpjc0000gn/T/rp-ssh1850895886/ssh.sock vps01 /home/user/resticprofile/resticprofile -v -r http://localhost:35855/configuration/openssh
2026/03/30 22:36:31 sending configuration for "openssh"
2026/03/30 22:36:31 file examples/linux.yaml: written 4325 bytes
2026/03/30 22:36:31 file examples/key: written 25 bytes
2026/03/30 22:36:31 file examples/excludes: written 9 bytes
2026/03/30 22:36:31 manifest written 126 bytes
2026/03/30 22:36:32 resticprofile 0.33.0-dev compiled with go1.26.1 linux/arm64
2026/03/30 22:36:32 downloading file linux.yaml (4325 bytes)
2026/03/30 22:36:32 downloading file key (25 bytes)
2026/03/30 22:36:32 downloading file excludes (9 bytes)
2026/03/30 22:36:32 downloading manifest (126 bytes)
2026/03/30 22:36:32 using configuration file from manifest: "linux.yaml"
2026/03/30 22:36:32 mounting filesystem at /tmp/resticprofile-3269189331
2026/03/30 22:36:32 loading: linux.yaml
2026/03/30 22:36:32 memory available: 3545MB
2026/03/30 22:36:32 setting process group priority to 10
2026/03/30 22:36:32 setting IO priority class to 3, level 7
2026/03/30 22:36:32 using restic 0.18.1
2026/03/30 22:36:32 profile 'self': starting 'init'
2026/03/30 22:36:32 command environment: reusing previous
2026/03/30 22:36:32 starting command: /usr/bin/restic init --password-file=key --repo=/tmp/backup/self --verbose=1
created restic repository 8a4a6760e3 at /tmp/backup/self

Please note that knowledge of your password is required to access
the repository. Losing your password means that your data is
irrecoverably lost.
2026/03/30 22:36:35 profile 'self': finished 'init'
2026/03/30 22:36:35 unmounting filesystem
Shared connection to 127.0.0.1 closed.
2026/03/30 22:36:35 running command: /usr/bin/ssh -S /var/folders/yn/xqk1tr8527q75jh_y134tpjc0000gn/T/rp-ssh1850895886/ssh.sock -O exit vps01
Exit request sent.

So now, here are my questions:

  • does it look secure enough?
  • anything that can go wrong during the process?

And yes, the manager is a single point of failure. Then again, so is the backup server :laughing:


@creativeprojects interesting, and something that has been discussed in one form or another here and on GitHub. Basically you turn the push backup into a pull backup over SSH. Did you modify resticprofile for this, or can it be done out of the box?

FYI, I do something similar but with native restic and no mounting. I have restic installed on the machine that needs to be backed up, then I SSH into it with a reverse tunnel from the backup host to make the rest-server repo and password available. Whether it’s 100% secure I don’t know, but once it’s running you cannot see the backup or the environment variables in the client’s processes. For me it was more about convenience than security.

Here’s the basic command I use:
ssh -R 1234:localhost:8888 user@client_to_be_backed_up "export RESTIC_REPOSITORY=rest:http://localhost:1234/restic.repo && export RESTIC_PASSWORD=your_password && restic backup /path/to/data/to/be/backed/up"
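One caveat with a one-liner like this: the exported password ends up inside the remote command line, which is visible to other users on the client via ps or /proc. A variant (untested sketch; host and paths are the same placeholders as above) that sends the password over SSH’s stdin instead:

```shell
# Same reverse-tunnel idea, but the password never appears in any argv:
# it is piped through the SSH channel and read into a variable remotely.
# RESTIC_REPOSITORY and RESTIC_PASSWORD are standard restic env variables.
printf '%s\n' 'your_password' | ssh -R 1234:localhost:8888 user@client_to_be_backed_up '
  IFS= read -r RESTIC_PASSWORD && export RESTIC_PASSWORD
  export RESTIC_REPOSITORY=rest:http://localhost:1234/restic.repo
  restic backup /path/to/data/to/be/backed/up
'
```

Note the password is still in the manager’s shell history and process list; this only hides it on the client side.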

Is it secure? That depends on your threat model. I can fully understand my own command, but not your log with the scrambled /var/folders and socket parts. But that’s just me.

Anything that can go wrong? What if the network connection gets interrupted after the mount is done: will it leave traces behind, or die gracefully?

Did you over-engineer it? Again, I’m more of an end user than a DevOps engineer, so you’re a better judge of that. 8-)

Yes, a few things were added: it’s the same binary running as the client and as the server (manager).
So I’ve added the local web server to transfer the files, and the FUSE mount to make them all available on the client; from there, it’s the usual resticprofile creating the restic command from the configuration files (on the mounted virtual disk).

It certainly does look good enough to me, since you can do everything using environment variables.

I haven’t checked whether SSH is going to kill the running command if the connection is broken. If it does, it should result in a graceful shutdown.
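To illustrate what a graceful shutdown on the client side could look like, here is a toy sketch (not resticprofile code; the echo stands in for the real unmount call): the client traps HUP/TERM, so even if the dropped connection gets the process signalled, the unmount step still runs.

```shell
# Toy demonstration of signal-driven cleanup. A child shell traps
# HUP/TERM/INT and "unmounts" before exiting; we simulate a dropped
# connection by sending it SIGTERM from within.
out=$(sh -c '
  cleanup() { echo "unmounting filesystem"; exit 0; }
  trap cleanup HUP TERM INT
  echo "backup running"
  kill -TERM $$        # simulate the SSH connection dropping
')
echo "$out"
```

The captured output shows "backup running" followed by "unmounting filesystem", i.e. the cleanup ran despite the simulated disconnect.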