
unRAID as a rsync target/server



Fair enough, I don't think it would be that difficult. An example script would look something like this; just swap in your own mounts/mount points (even using a loop if you like):

mkdir -p /mnt/mytransferdir   # temporary mount point
mount.cifs -o username=mysynologyuser,password=mysynologypasswd,file_mode=0660,dir_mode=0770,vers=2.1 //SynologynameorIP/sharename /mnt/mytransferdir
rsync -av /mnt/mytransferdir/ /mnt/user/Pictures   # or whatever you need; -z only helps over a network transport, not a local copy
umount /mnt/mytransferdir
# Repeat this for each share you want synced

This could just as easily work with an NFS share, if that's how you roll and Synology lets you mount it from the command line. The thing is that once you get one good rsync through, subsequent runs presumably won't need to copy all the data, so even if the first pass is slow, the plan should be adequate after that.
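To make the "loop over multiple shares" idea concrete, here is a sketch. The share names, NAS address and credentials are placeholders, and with DRY_RUN=1 it only prints the commands it would run; set DRY_RUN=0 on the actual Unraid box. For NFS you'd swap the mount.cifs line for a mount -t nfs equivalent.

```shell
#!/bin/sh
# Sketch only: placeholder NAS name, credentials, and share list.
# DRY_RUN=1 prints each command instead of executing it.
DRY_RUN=1
run() { if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi; }

NAS="//SynologynameorIP"
OPTS="username=mysynologyuser,password=mysynologypasswd,file_mode=0660,dir_mode=0770,vers=2.1"

for share in Pictures Documents Music; do   # placeholder share names
    run mkdir -p "/mnt/mytransferdir/$share"
    run mount.cifs -o "$OPTS" "$NAS/$share" "/mnt/mytransferdir/$share"
    run rsync -av "/mnt/mytransferdir/$share/" "/mnt/user/$share"
    run umount "/mnt/mytransferdir/$share"
done
```

Running it as-is just lists the four commands per share so you can sanity-check the paths before doing it for real.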

Edited by Delarius
fix the rsync command
Link to comment
  • 5 months later...
  • 3 months later...

There is LuckyBackup, a GUI for rsync running in a Docker container.
It looks a bit old-fashioned, but it lets you organize things into multiple profiles and different tasks (basically one per folder).

 

I am syncing backups between two locations (sometimes even a third) connected via VPN to get an offsite copy. I have a profile for each location's backups, and then a task for each folder:
- a folder for system partition backups of that location (basically where Veeam, EaseUS Todo Backup, Macrium Reflect or whatever software saves laptop/desktop images to)

- a folder for the network drive (photos and such)

- a folder for data folders that are backed up with Syncthing, so they don't make the system backup too big

 

I can even access the shares provided by the rsync daemon on my Synology NAS (which wouldn't work the other way around out of the box, because no rsync daemon runs on Unraid by default).

The config is a bit fiddly and it's not robust (it won't retry if the connection breaks), but you can set up schedules. For now I still run it manually.
What rsync has problems with is backing up a large number of small files, so I'm looking into different solutions for photos and documents.
Maybe I'll take a look at Amanda or another "orchestrated" solution.

Edited by wambo
Link to comment
  • 10 months later...

hey all - just tagging along. 
 

I can't believe there isn't an easy way to backup synology nas to an unraid server.


I assumed I could mount my unraid share as a network directory and just backup to it, but nope.

 

Scripts scare the pi$$ out of me so that's a no go.

 

If anyone has found a way to do this, please tag in and let me know.  I've got like 40TB of space on my unraid box just sitting here and a synology that I would like to copy over to it.  

Link to comment
  • 3 weeks later...

I used “Unassigned Devices” and “Unassigned Devices Plus” plugins to mount the Synology DS920+ directly within Unraid. From there I just rsync from one /mnt/ to another. Works pretty well so far and hovers around 70-80Mbs.

Link to comment
13 hours ago, GlassPup said:

I used “Unassigned Devices” and “Unassigned Devices Plus” plugins to mount the Synology DS920+ directly within Unraid. From there I just rsync from one /mnt/ to another. Works pretty well so far and hovers around 70-80Mbs.

Not everyone, myself included, is super comfortable with the CLI or writing scripts. And for all the things Synology does wrong, one thing they did right IMO is the UI for Hyper Backup. It's so easy to pick and choose a bunch of folders across the NAS and schedule a backup. I used to use Hyper Backup pointed at a QNAP NAS that had an rsync server running. Took all of 2 minutes to get set up.

 

But Unraid can't be used as a target rsync server, or at least neither I nor anyone else has figured out how. So all the backups across the network need to be done from within the Unraid CLI, mapping each remote folder one by one.

 

AFAIK there also aren't any built-in failure notifications with that, so that would mean scripting out error detection and notification functions, etc. I think you can see where this is going.

 

It would be so much simpler if there were either a built-in rsync server that could act as the target for network-wide backups, or an addon/user script/Docker template offering these functions.
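For what it's worth, the server side of rsync needs surprisingly little configuration. A minimal sketch of an rsyncd.conf, with placeholder module name, path and user (not something Unraid ships, which is the whole problem being discussed):

```conf
# /etc/rsyncd.conf -- minimal sketch; module name, path and user are placeholders
uid = nobody
gid = users

[backups]
    path = /mnt/user/backups
    read only = false
    auth users = backupuser
    secrets file = /etc/rsyncd.secrets   # lines of "user:password", chmod 600
```

A Hyper Backup rsync target would then point at the "backups" module on port 873.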

 

Alas, I've never been able to find one. I've resorted to getting another Synology and backing them up to each other.

Link to comment
  • 1 month later...

For regular rsyncd (not using SSH) I've figured this out. For my use case this should be fine, as I'm running Tailscale between a DS218j and my Unraid instance, but smarter people than me can weigh in on that. Limited to a single user, best I can tell.

Keeping the secrets file and the rsyncd.conf outside of the Docker container would be nice, but I don't know enough about Docker to say how this would be accomplished. Anyone? Doing that could allow multiple users, I think, as you could just edit the files and they'd survive a container upgrade since they'd be in appdata.

On the DiskStation side the "share" shows up as "volume". This is configurable as per the Docker docs, but I'm not sure to what advantage, at least for my use case. Hopefully this helps someone else.

Also, side note: I followed the video below (I can't speak/understand Spanish!) and was able to set up DiskStation as a VM on Unraid for testing this. I haven't done a backup of a "real" DiskStation yet, but I'm able to connect and proceed through the Hyper Backup wizard, so I can't see it not working the same as my virtual DiskStation.

 

 

Docker container:

https://github.com/axiom-data-science/rsync-server
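On the question of keeping rsyncd.conf and the secrets file outside the container: the usual Docker answer is a bind mount from appdata. A sketch follows; the appdata location, image name and in-container paths (/etc/rsyncd.conf, /etc/rsyncd.secrets) are assumptions to verify against the rsync-server README, since the image's entrypoint may generate its own config. The script only prints the command so it can be inspected first.

```shell
#!/bin/sh
# Bind-mount config files from appdata so they survive container upgrades.
# Host/container paths and the image name are assumptions -- check the
# axiom-data-science/rsync-server README before running for real.
APPDATA=/mnt/user/appdata/rsync-server   # hypothetical appdata location

CMD="docker run -d --name rsync-server -p 873:873 \
  -v $APPDATA/rsyncd.conf:/etc/rsyncd.conf \
  -v $APPDATA/rsyncd.secrets:/etc/rsyncd.secrets \
  -v /mnt/user/backups:/data \
  axiom/rsync-server"
echo "$CMD"
```

Because the files then live in appdata, editing them (e.g. to add more users to the secrets file) wouldn't require rebuilding the container.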


 


 

Windows Explorer of the rsync folder contents



 

Console of the rsync docker with contents of the rsyncd.conf and rsyncd.secrets file for reference.

 


 

Edited by Mr.Grumpy
Link to comment
  • 4 months later...

Sorry to dig up an old thread, but I wanted to reply to the above. I tried the method from @Mr.Grumpy and all seems to go well up until performing the backup: it starts off fine, with all the files being created and the copy beginning, but after a short period it stops and then fails.

 

[Network][docker-daily-pret] Exception occurred while backing up data. (No permission to access the backup destination [192.168.0.41(volume)]. Please check you have the right permission to the backup destination.)

 

I tried this both with the default 'root' user for the container and, after deleting the files, with a mapped user with the correct PUID and PGID. Has anyone else had success with Hyper Backup?

 

This is going from an actual 1821+ to Unraid 6.12.10
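Not a fix, but a first thing worth checking for that "No permission to access the backup destination" error: whatever UID/GID the rsync daemon writes as must be able to write the host path the container maps. The TARGET path below is a hypothetical stand-in (it creates a temp dir if unset); on the real box point it at the mapped share.

```shell
#!/bin/sh
# Compare the destination's numeric owner/group against the UID/GID the
# daemon runs as. TARGET is a placeholder; substitute the real share path.
TARGET="${TARGET:-$(mktemp -d)}"   # e.g. /mnt/user/backups on the real box

ls -ldn "$TARGET"   # numeric owner:group of the destination
id -u; id -g        # who this process runs as
# Unraid shares default to nobody:users (99:100); if the container writes as
# a different UID, map PUID=99 / PGID=100, or on the host:
# chown -R 99:100 "$TARGET"
```

If the owner shown by ls doesn't match the container's user, that mismatch is a plausible cause of the Hyper Backup permission failure.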

Link to comment
