I have a Proxmox+Debian+Docker server and I’m looking to set up my backups so that they get backed up (DUH) to my Linux PC whenever it comes online on the local network.

I’m not sure whether it’s best to back up locally and have something else handle the copying, how to make those backups run only if they haven’t run in a while (regardless of the PC’s availability), or whether the PC or the server should run the logic.

Mostly I don’t want to waste space on my server because it’s limited…

I don’t know the what and I don’t know the how currently; any input is appreciated.

  • dwindling7373 (OP) · 1 month ago
    I will probably start with this approach and see where it leads me, thanks!

    • tvcvt@lemmy.ml · 1 month ago
      Since you’re interested in this kind of DIY approach, I’d seriously consider thinking the whole process through and writing a simple script for it that runs from your desktop. That will make it trivial to do an automatic backup whenever you’re active on the network.

      Instead of cron, look into systemd timers: as a user timer, you can fire off your script after, say, one minute of being on your desktop with a monotonic setting like OnStartupSec=60, and repeat it while you stay logged in with OnUnitActiveSec.
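
      A minimal sketch of the two user units that could look like (unit names, the repeat interval, and the script path are placeholders):

      # ~/.config/systemd/user/backup.timer
      [Unit]
      Description=Pull backups from the server shortly after login

      [Timer]
      # Fire one minute after the user session's service manager starts (i.e. login)...
      OnStartupSec=60
      # ...then repeat every 6 hours for as long as you stay logged in
      OnUnitActiveSec=6h

      [Install]
      WantedBy=timers.target

      # ~/.config/systemd/user/backup.service
      [Unit]
      Description=Pull backups from the server

      [Service]
      Type=oneshot
      ExecStart=%h/bin/server-backup.sh

      Enable it once with systemctl --user enable --now backup.timer and it will take care of itself from there.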

      Thinking through the script in pseudocode, it could look something like:

      rsync -avzh $server_source $desktop_destination || curl -d "Backup failed" ntfy.sh/mytopic

      This would pull the backup from your server to your desktop and, if the backup failed, use a service such as ntfy.sh to notify you of the problem.
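
      Fleshed out slightly, the script might first check that the server is reachable so the timer can just try again later (the hostname, paths, and ntfy topic below are all placeholders):

      #!/usr/bin/env bash
      set -euo pipefail

      server="myserver.lan"                      # placeholder hostname
      src="backup@${server}:/srv/backups/"       # placeholder source path on the server
      dest="$HOME/server-backups/"               # placeholder destination on the desktop

      # Skip quietly if the server isn't up; the timer will retry on its next run.
      ping -c 1 -W 2 "$server" >/dev/null 2>&1 || exit 0

      # Pull the backups; on failure, send a notification via ntfy.sh.
      rsync -avzh "$src" "$dest" \
          || curl -d "Backup failed" ntfy.sh/mytopic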

      I think that would pretty much take care of all of your requirements, and if you ever decided to switch systems (like using zfs send/recv instead of rsync), it would just be a matter of altering that one script.
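
      For instance, the rsync line could later be swapped for an incremental ZFS replication along these lines (the pool, dataset, and snapshot names here are made up, and zfs send/recv needs root or delegated permissions on both ends):

      # Send the changes between two server snapshots into a local backup pool
      ssh backup@myserver.lan zfs send -i tank/data@prev tank/data@latest \
          | zfs recv -F backup/data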