[Support] binhex - Syncthing



Overview: Support for Docker image arch-syncthing in the binhex repo.

Application: Syncthing https://syncthing.net/

Docker Hub: https://hub.docker.com/r/binhex/arch-syncthing/

GitHub: https://github.com/binhex/arch-syncthing

Documentation: https://github.com/binhex/documentation

 

If you appreciate my work, then please consider buying me a beer 😁
 


 

For other Docker support threads and requests, news, and Docker template support for the binhex repository, please use the "General" thread here.

5 minutes ago, dhstsw said:

looks like the path for /media isn't mapped correctly in the container.

it looks correct to me, what makes you think that?

 

6 minutes ago, dhstsw said:

It doesn't store files in /media but somewhere else (the container image files get filled quite quickly).

this image does not store any media, it is a sync utility, so it's for syncing media between devices, so there should be no media copied to this container. do you have an example of this happening?

  • 1 month later...

Thanks for supporting Syncthing. It looks like the latest update broke something: I have two different Unraid servers, and Syncthing broke when I updated it on both.
I tried pulling latest, with no luck.

The logs show:

2021-04-06 22:55:27,064 DEBG fd 11 closed, stopped monitoring <POutputDispatcher at 23178730679744 for <Subprocess at 23178730781904 with name syncthing in state STARTING> (stdout)>

2021-04-06 22:55:27,064 DEBG fd 15 closed, stopped monitoring <POutputDispatcher at 23178730448352 for <Subprocess at 23178730781904 with name syncthing in state STARTING> (stderr)>

2021-04-06 22:55:27,064 INFO exited: syncthing (exit status 1; not expected)

2021-04-06 22:55:27,064 DEBG received SIGCHLD indicating a child quit

2021-04-06 22:55:28,065 INFO gave up: syncthing entered FATAL state, too many start retries too quickly


Just curious if you have any ideas. I did look around the interwebs but didn't find anyone having this issue.

3 hours ago, jbat66 said:

Thanks for supporting Syncthing. It looks like the latest update broke something: I have two different Unraid servers, and Syncthing broke when I updated it on both.
I tried pulling latest, with no luck.

The logs show: [log lines snipped; identical to the post above]


Just curious if you have any ideas. I did look around the interwebs but didn't find anyone having this issue.

there was a bug in a recent release, a quick follow-up update fixed this, so please do a 'check for updates' and then 'update all' and this should get it working.

4 hours ago, binhex said:

there was a bug in a recent release, a quick follow-up update fixed this, so please do a 'check for updates' and then 'update all' and this should get it working.

I thought I would try a force update this morning and it fixed it... Then I came here to let you know, and it looks like you pushed the fix. Thanks so much, have a great day!

 


I'm thinking about switching from Resilio. Is there a way to prioritize syncs? I have 3 computers, and 1 has upload and download speeds more than double the other 2. Is there a way for the other two to send to the fast connection first, before they send to each other? This would speed syncing up a lot.


@jbat66 Is the sync between your Unraid servers still working fine? I tried it, and it seems that as long as I create the files that should be synced, it works fine. But as soon as I copy a file into the SMB share that Syncthing is watching, the whole sync process gets stuck at 0%.

  • 3 weeks later...
On 4/12/2021 at 4:33 AM, nicksphone said:

I'm thinking about switching from Resilio. Is there a way to prioritize syncs? I have 3 computers, and 1 has upload and download speeds more than double the other 2. Is there a way for the other two to send to the fast connection first, before they send to each other? This would speed syncing up a lot.

Yes, you can set up the sync links the way you want. Say Computer B is the fast computer: set up sync A <-> B and C <-> B, but don't set up A <-> C. That way everything goes through B.
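This hub topology is visible in Syncthing's own config: a folder is only synced with the devices listed inside its <folder> element, so on Computers A and C you share the folder with B only. A hedged sketch of the relevant fragment of config.xml (the folder id, label, path, and device ID below are made up for illustration):

```xml
<!-- illustrative fragment of Syncthing's config.xml on Computer A -->
<!-- only Computer B is listed, so A never syncs directly with C -->
<folder id="docs" label="Documents" path="/media/docs" type="sendreceive">
    <device id="BBBBBBB-EXAMPLE-DEVICE-ID-FOR-COMPUTER-B"></device>
</folder>
```

The same shape on Computer C (again sharing only with B) completes the A <-> B <-> C chain; changes propagate between A and C via B's faster link.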

On 4/15/2021 at 7:09 AM, slize said:

@jbat66 Is the sync between your Unraid servers still working fine? I tried it, and it seems that as long as I create the files that should be synced, it works fine. But as soon as I copy a file into the SMB share that Syncthing is watching, the whole sync process gets stuck at 0%.

Yes, mine is working great. Make sure your underlying permissions are correct on the files as well as in the Docker containers. My containers run with UMASK 000, PUID 99, PGID 100. I hope that helps.
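To check whether a share's ownership actually matches those PUID/PGID values, a quick sketch from the Unraid host console (the share path below is just an example; on stock Unraid, UID 99 / GID 100 is nobody:users):

```shell
# show the numeric owner/group of the share -- should match the container's PUID/PGID
ls -ldn /mnt/user/Sync

# if ownership is wrong, reassign it (99:100 = nobody:users on stock Unraid)
chown -R 99:100 /mnt/user/Sync

# ensure owner and group can read/write files and traverse directories
chmod -R u+rwX,g+rwX /mnt/user/Sync
```

The capital `X` in the chmod mode adds execute only to directories (and files that are already executable), which is what you want for traversal without marking every file executable.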

 

On 5/4/2021 at 2:48 AM, jbat66 said:

Yes, mine is working great. Make sure your underlying permissions are correct on the files as well as in the Docker containers. My containers run with UMASK 000, PUID 99, PGID 100. I hope that helps.

 

I just found the issue. Very strange, but it seems that a WireGuard VPN connection of type "Remote Access To Server" is not enough. You need "Server To Server Access" to get the Syncthing/rsync daemon to work.

  • 5 weeks later...

Does anyone have an idiot's guide to setting up Syncthing on Unraid? I'm hoping to also set it up on my Android phone to auto-transfer pictures, but I'm not really sure how to do any of the settings/setup. I have the binhex Syncthing Docker installed on Unraid, and the app installed on my phone, but I'm not sure where to go from there. In an ideal scenario, it would transfer via Wi-Fi when I launch the app if a) there are new pictures on the phone (I only need one-way sync), and b) it's on my home Wi-Fi SSID. Second best would be doing it at a specific time, but because of battery draw I'd prefer it not always be searching for the SSID. Any thoughts/suggestions/places to go to learn? Thanks!


Hi.

Two of my devices are not able to find each other on the same LAN (the Unraid server and a Windows box). It was especially weird because all the other devices had no issues (no firewalls, open ports).

 

In the end I found out that this container is missing the announce UDP port 21027 from its configuration, and 22000 stays closed. Using nmap from Windows and another Linux machine, 22000 is detected as closed.

When I add 21027, it does not show up in the 'docker allocations' (neither do other ports I tested, on this and other containers), and nmap unsurprisingly shows it as closed. Switching between host and bridge network does not help.

 

root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name='binhex-syncthing' --net='host' -e TZ="Europe/Berlin" -e HOST_OS="Unraid" -e 'TCP_PORT_8384'='8384' -e 'TCP_PORT_22000'='22000' -e 'UMASK'='000' -e 'PUID'='99' -e 'PGID'='100' -e 'UDP_PORT_21027'='21027' -e 'UDP_PORT_75'='75' -e 'TCP_PORT_76'='76' -v '/mnt/user':'/media':'rw' -v '/mnt/user/appdata/binhex-syncthing':'/config':'rw' 'binhex/arch-syncthing'

...

The command finished successfully!

 

nmap scanning the Windows box shows every interesting port closed, even if the firewall has rules for Syncthing or is disabled completely.
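As a sanity check that doesn't depend on nmap, TCP reachability can be probed with bash's built-in /dev/tcp redirection. A sketch, where the host IP is an example placeholder; 22000/tcp is Syncthing's sync port and 21027/udp its local discovery port:

```shell
#!/bin/bash
# probe Syncthing's TCP sync port from another machine on the LAN
HOST=192.168.1.10   # example -- replace with your server's address

if timeout 3 bash -c "echo > /dev/tcp/$HOST/22000" 2>/dev/null; then
    echo "22000/tcp open"
else
    echo "22000/tcp closed or filtered"
fi

# 21027/udp (local discovery) can't be probed this way; on the server itself,
# check that something is listening instead:
#   ss -ulnp | grep 21027
```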

 

Am I doing anything wrong? Is this a configuration issue on my machine? A Docker bug?

 

[Attached screenshots: docker ports missing.png, portscan.png]


Hi-

 

I'm trying to use Syncthing as a backup of the My Documents folder on a PC on the network, but I keep getting an error message when I try to start it. I created a backup folder with a path that points to the backup share on my Unraid server (/mnt/user/Backup/PC-001/My Documents), went into the versioning tab, and set it to 30 days. When I create the folder, Syncthing gives me an error about not having access to create the directory.

 

Maybe I'm not using Syncthing correctly... but what I was trying to do was have it back up the My Documents folder on the PC to the Unraid share (which is backed up via CrashPlan).

 

Thanks.

On 6/12/2021 at 7:43 PM, propman07 said:

Hi-

 

I'm trying to use Syncthing as a backup of the My Documents folder on a PC on the network, but I keep getting an error message when I try to start it. I created a backup folder with a path that points to the backup share on my Unraid server (/mnt/user/Backup/PC-001/My Documents), went into the versioning tab, and set it to 30 days. When I create the folder, Syncthing gives me an error about not having access to create the directory.

Maybe I'm not using Syncthing correctly... but what I was trying to do was have it back up the My Documents folder on the PC to the Unraid share (which is backed up via CrashPlan).

 

Thanks.

 

Make sure you are running Syncthing with the correct UMASK/PUID/PGID, and make sure that the folder in the share has the correct permissions.

 

While this may not work for your setup, I'm using UMASK 000, PUID 99, PGID 100.

Another thing you can look at is binhex-urbackup. It just works! It is like your own CrashPlan, but free. You can do file-level backups, limit what you want backed up, and even do image-level backups.

 

On 2/25/2021 at 12:22 PM, binhex said:

this image does not store any media, it is a sync utility, so it's for syncing media between devices, so there should be no media copied to this container. do you have an example of this happening?

 
 

Nothing wrong with your container, but the default path Syncthing itself offers when adding folders is '~', AKA '/home/nobody', which saves the data into the container image and fills up docker.img really quickly. I suggest changing the defaults so that the default paths line up with the mount to the host FS (whether that means making the default '/media' instead of '/home/nobody', or changing the Syncthing config). In the end it's not a big deal for someone who knows what they are doing, but it did take me some time to figure out, as I was hoping the defaults would have sane behavior and I wouldn't have to fiddle.

Thanks for making this :)

For newbs trying to get this to work: run the Docker with default settings, then change this in Syncthing, and make sure that if you manually specify paths, you use something that starts with `/media`.
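The default in question can also be changed in Syncthing's config.xml (stored under the container's /config mount), rather than through the GUI. A hedged sketch of the relevant option, assuming a Syncthing version that exposes `defaultFolderPath` in the `<options>` block:

```xml
<!-- inside Syncthing's config.xml: change the default from '~' so that
     newly added folders land on the /media bind mount, not inside the image -->
<options>
    <defaultFolderPath>/media</defaultFolderPath>
</options>
```

Restart the container after editing so Syncthing picks up the change; the equivalent GUI setting is "Default Folder Path" under the advanced options.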

 


  • 4 weeks later...
On 6/15/2021 at 10:42 PM, salotz said:

Nothing wrong with your container, but the default path Syncthing itself offers when adding folders is '~', AKA '/home/nobody', which saves the data into the container image and fills up docker.img really quickly. I suggest changing the defaults so that the default paths line up with the mount to the host FS (whether that means making the default '/media' instead of '/home/nobody', or changing the Syncthing config). In the end it's not a big deal for someone who knows what they are doing, but it did take me some time to figure out, as I was hoping the defaults would have sane behavior and I wouldn't have to fiddle.

Thanks for making this :)

For newbs trying to get this to work: run the Docker with default settings, then change this in Syncthing, and make sure that if you manually specify paths, you use something that starts with `/media`.

 


 

@salotz - Thanks for the tip - I'm a newbie and this specific issue is killing me; I have been unable to get a synced folder on my desktop to save to shares on my array (and vice versa, share to desktop). So just setting the default path to /media would let files from a synced folder on my desktop sync to a share on my array?

 

For example, a synced folder at C:\Users\Name\Documents\sync from my desktop computer that I sync with my unraid server should save to /mnt/user/sync?  

 

Thanks Salotz...  and Binhex - love your Containers, thanks for everything you do.

 

Thanks!

  • 4 weeks later...
On 6/6/2021 at 3:37 PM, joesstuff said:

Does anyone have an idiot's guide to setting up syncthing on unraid?  I'm hoping to also set it up on my android phone to auto-transfer pictures, but I'm not even really sure how to do any of the settings/setup.  I have the binhex syncthing docker installed

Ditto on this... it would be great if SpaceinvaderOne created a video for this!

  • 5 weeks later...

Hi, 

 

I recently downloaded Syncthing and have found that it crashes Unraid. By "crash", I mean that it locks up everything and I can't get into the server via the IP address or SSH.

 

Initially I opened this thread: 

 

 

and this thread: 

In both cases I did not know the cause at first. I have found that my server keeps running just fine with the binhex Syncthing turned off, but when I turn it on, it crashes the whole server at random times. I do not know what log files are needed.

 

Could anyone please tell me what is needed in order to figure out why this container crashes the server?

 

 

22 hours ago, workermaster said:

I recently downloaded Syncthing and have found that it crashes Unraid. By "crash", I mean that it locks up everything and I can't get into the server via the IP address or SSH.

 

i'm not saying it definitely is NOT this docker image, but i would be surprised if it brings down your entire system, especially since you are the first person to report this issue and this image has had a fair few downloads (100,000+). i would suspect hardware issues first, probably syncthing pushing memory usage and thus triggering the issue - you have run a memtest, right?

 

having said all that, feel free to attach the log file and i will take a quick look over it, but i doubt it will reveal much; the log file is located at /config/supervisord.log

38 minutes ago, binhex said:

 

i'm not saying it definitely is NOT this docker image, but i would be surprised if it brings down your entire system, especially since you are the first person to report this issue and this image has had a fair few downloads (100,000+). i would suspect hardware issues first, probably syncthing pushing memory usage and thus triggering the issue - you have run a memtest, right?

having said all that, feel free to attach the log file and i will take a quick look over it, but i doubt it will reveal much; the log file is located at /config/supervisord.log

Hi, 

 

I also thought it would be very strange if there were something wrong with the container, because then it would have been reported by other people, but every time I let the container run through the night, it crashes.

 

I took a look at the log files. They seem to contain quite a bit of sensitive information. I also see 6 log files with the same name, just with a different number at the end. All of them report the same line from one of the synchronised folders. I am pretty sure that I added some folders to sync after I started experiencing the crashing, but I am slowly starting to doubt myself.

 

I wanted to create the best log files for you, so I deleted the container and its folder from my cache drive, then installed the container again without changing anything. I will let it run and see if it crashes again tonight; if it does not, I will add one folder to sync. Could it be that there is a problem because one of the folders it needs to sync contains over 500,000 files?

 

The memory running out does not seem likely. The system had plenty of free RAM. There is 32GB in the system (planning on upgrading to 64) of wich 20GB are for a VM, 3.5GB are for another VM and the rest is for Unraid and Dockers. I have also never run a memtest. I am currently running one. I will stop that test later today when I need the server again.
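On the 500,000-file question: one host-side limit worth checking (an educated guess, not a confirmed diagnosis for a full lockup) is the kernel's inotify watch limit, which Syncthing's filesystem watcher can exhaust on very large folder trees. A sketch for inspecting and raising it on the host:

```shell
# current limit on inotify watches; Syncthing's watcher needs roughly
# one watch per directory it monitors
cat /proc/sys/fs/inotify/max_user_watches

# raise it for the current boot (524288 is a commonly suggested value)
sysctl -w fs.inotify.max_user_watches=524288
```

Hitting this limit normally just disables Syncthing's change watching (falling back to periodic rescans) rather than crashing anything, so it explains sluggish syncs on huge folders more than it explains a server lockup.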

 

