[Support] ich777 - Application Dockers



On 9/5/2022 at 5:55 AM, ich777 said:

I've tried it now on my server and over here this is not the case.

Have you changed anything in the template?

 

Only the web interface port as it conflicted with another docker container.

 

It's weird: I literally went to log in and set up a stream for tonight, and once again all my settings have reverted to default.

Stream key, logos, video settings... all default.

The strange thing is, I don't think I have changed or updated the container since the last time I used it.

Link to comment
3 minutes ago, enigma27 said:

The strange thing is, I don't think I have changed or updated the container since the last time I used it.

Is your appdata share set to use cache only or prefer in the Share settings?

If not, I could imagine that the mover moved the files to the array, but the container is still looking for them on the cache and can't find them, so the settings are reverted. That's my best guess…
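A quick way to check where the files actually ended up is from the Unraid terminal; the paths below assume the default appdata location and a pool named "cache", and the container folder name is just a placeholder:

# still on the cache pool?
ls -la /mnt/cache/appdata/<container-folder>
# already moved to the array? (/mnt/user0 is the array-only view of the user shares)
ls -la /mnt/user0/appdata/<container-folder>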

Link to comment

Hello,

 

Is there a way for luckyBackup to run multiple threads/backups at once? I have around 18 TB that I want to back up to another Unraid server. Currently I have a backup running, but it seems it's going to take almost 11 days to finish. I'd also like to take the primary server offline for some hardware changes, as my power supply fan seems to be failing.

Link to comment
8 hours ago, enigma27 said:

my appdata folder is set to cache only

Then I really don't know what the issue is. Please double-check every setting…

You can also try downloading a fresh copy from the CA App, change only the port so that it doesn't interfere with other containers, and see if it works there.

 

As said before, over here everything is working just fine and all settings are preserved as they should be.

Link to comment
6 hours ago, slughappy1 said:

Is there a way for luckyBackup to run multiple threads/backups at once?

No, you can set up multiple sync tasks in luckyBackup to sync different folders, but it's not able to run them in parallel.

 

However, when you are using the schedule, you should be able to set up multiple syncs to start at the same time (don't forget to tick the ConsoleMode box).
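Just to illustrate what the schedule does under the hood: it effectively writes cron entries inside the container, so two profiles scheduled for the same minute simply become two cron lines that start at the same time. A rough sketch (the profile names and paths are made up, and the exact command line is generated by luckyBackup itself once ConsoleMode is ticked):

# both syncs kick off at 02:00, independent of each other
0 2 * * * luckybackup --no-questions /luckybackup/.luckyBackup/profiles/shares-a.profile
0 2 * * * luckybackup --no-questions /luckybackup/.luckyBackup/profiles/shares-b.profile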

Link to comment

Hi there, just a minor issue. For files that have been synced down onto a shared folder, the permissions show up as rw for the owner only instead of rw for owner, group, and others, so when I access the share in Windows I can't open any of the files.

I guess I have to chmod 777 the folders and files?

Any chance of a fix, or any idea what might be causing this issue?
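If you just need to make the already-synced files usable right away, a one-off fix from the Unraid terminal would look roughly like this (the share path is a placeholder; 99:100 is the nobody:users owner Unraid expects, and the capital X only adds execute to directories):

# hand the files to nobody:users and open them up for group and others
chown -R 99:100 /mnt/user/yourshare/synced-folder
chmod -R u+rwX,g+rwX,o+rwX /mnt/user/yourshare/synced-folder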

Link to comment
1 hour ago, ich777 said:

What permissions do the files have when you upload them?

Do you run the container with UID 99, GID 100 and UMASK 000?

The files are being downloaded from Mega onto the Unraid server, and the container is being run with UID 99, GID 100, and UMASK 000. The files are not corrupted or anything, as I can view them fine in the Mega app.

Edited by Goldmaster
Link to comment
31 minutes ago, Goldmaster said:

The files are not corrupted or anything, as I can view them fine in the Mega app.

I initially designed the container to upload files to Mega.

Anyway, isn't there an option within the Mega app to set the permissions, or am I mistaken?

You can also try rclone from the CA App and check whether it's the same there too.

Link to comment
10 minutes ago, ich777 said:

Anyway, isn't there an option within the Mega app to set the permissions, or am I mistaken?

Yes, there is. I'm just not sure whether group permissions for folders are meant to have read, write, and execute ticked, and whether files are meant to have execute set for the owner as well?

I think all the tick boxes are meant to be ticked so that the numeric value for folders and files is 777?
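For reference, independent of this container: each tick box is one permission bit (read = 4, write = 2, execute = 1), summed separately for owner, group, and others. So all boxes ticked is 777; directories need the execute bit so they can be entered, while plain files usually only need read/write, i.e. 666. For example (placeholder names):

chmod 777 somefolder   # drwxrwxrwx - everything ticked, appropriate for directories
chmod 666 somefile     # -rw-rw-rw- - read/write for everyone, no execute needed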

Edited by Goldmaster
Link to comment
14 hours ago, ich777 said:

No, you can set up multiple sync tasks in luckyBackup to sync different folders, but it's not able to run them in parallel.

However, when you are using the schedule, you should be able to set up multiple syncs to start at the same time (don't forget to tick the ConsoleMode box).

So it's inherently single-threaded, but when using a schedule I could have it start multiple simultaneous single-threaded backups and effectively get multi-threading? Is ConsoleMode required because it executes without needing the GUI? Otherwise only one can run because only one GUI exists?

Link to comment
1 minute ago, slughappy1 said:

Is ConsoleMode required

Please read the container description again; it is required for a schedule because luckyBackup was not designed to run in a Docker container...

 

2 minutes ago, slughappy1 said:

Otherwise only one can run because only one GUI exists?

Exactly.

Link to comment
20 minutes ago, ich777 said:

Please read the container description again; it is required for a schedule because luckyBackup was not designed to run in a Docker container...

 

Exactly.

Not sure why I didn't connect cron with making a schedule... but that now makes sense.

Is there any reason why I couldn't use the rsync command that it came up with, open a few terminals, and execute the command more than once? Then, when that has all finished, use luckyBackup as the main scheduled backup going forward?

  • Terminal 1
    • rsync -h --progress --stats -r -tgo -P -l -D --update --protect-args -e "ssh -i /root/.ssh/id_rsa -p 22" /mnt/user/share1 [email protected]:/mnt/user/share1
  • Terminal 2
    • rsync -h --progress --stats -r -tgo -P -l -D --update --protect-args -e "ssh -i /root/.ssh/id_rsa -p 22" /mnt/user/share2 [email protected]:/mnt/user/share2
  • Terminal 3
    • rsync -h --progress --stats -r -tgo -P -l -D --update --protect-args -e "ssh -i /root/.ssh/id_rsa -p 22" /mnt/user/share3 [email protected]:/mnt/user/share3
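For what it's worth, the three transfers above could also be started in parallel from a single terminal by backgrounding them; a minimal sketch reusing the same share names and (obfuscated) destination as above, with the caveat that the --progress output of the three jobs will interleave:

# start all three rsync jobs in the background, then wait for them to finish
for s in share1 share2 share3; do
  rsync -h --progress --stats -r -tgo -P -l -D --update --protect-args \
    -e "ssh -i /root/.ssh/id_rsa -p 22" \
    "/mnt/user/$s" "[email protected]:/mnt/user/$s" &
done
wait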
Link to comment
3 minutes ago, ich777 said:

No, but why use luckyBackup then?

You could easily do this from Unraid itself too if you want to.

Fair point. I think what I want to do is use luckyBackup after I make the initial backup, as I tend to forget exact commands, or to actually run the backups. Having the container set up with a schedule would probably work better for me than trying to do it myself going forward. Thanks for your quick and helpful responses @ich777

Link to comment

Hello. I am having an issue with DirSyncPro. Things were working fine the last time I checked (early August), but when I try to run the Analyze step I get an error saying the source folder doesn't exist. Sure enough, if I modify the job to re-pick the source folder I can browse down a few levels to /sourcefiles/Nextcloud/nextcloud, but there are no files or folders beyond that. However, there actually are files and sub-folders there, and I've verified that not only with Nextcloud but also with the terminal command line. The folder I want to select is: /sourcefiles/Nextcloud/nextcloud/data/username1/files/InstantUpload/Camera

Within that folder are all of my recent photos synced using Nextcloud. Why can't DirSyncPro see these folders anymore? I double-checked all of my Docker settings but cannot figure out what needs to change, as it had been working fine for months if not years.

Link to comment
4 hours ago, zero_koop said:

Why can't DirSyncPro see these folders anymore?

Where is the share located on the host?

 

4 hours ago, zero_koop said:

I double-checked all of my Docker settings but cannot figure out what needs to change, as it had been working fine for months if not years.

Nothing has changed and the container wasn't updated either.

Link to comment
20 hours ago, ich777 said:

Where is the share located on the host?

My docker settings are:

Container Path: /sourcefiles

Host Path: /mnt/user/

So basically DirSyncPro can see all of my shares, including the share named Nextcloud, and it can see a single folder level down into that share before it strangely stops.

 

But if you are asking whether the share is located on the cache or not: I have "Use cache pool" set to "No".

 

Also of potential note: I have never been able to use Windows Explorer to browse the contents of the "nextcloud" folder on the "Nextcloud" share. I've always received a "you do not have permission to access" error. I wonder if this permission issue, which never affected DirSyncPro before, is now affecting the Docker container.

Edited by zero_koop
completeness
Link to comment
5 hours ago, zero_koop said:

I wonder if this permission issue, which never affected DirSyncPro before, is now affecting the Docker container.

If that were the case it wouldn't have worked before either...

 

It's possible that it's related, but I'm not too sure about that if it stopped working out of nowhere when it was working before. Have you tried yet pointing to the path directly in the template and changing it in DirSyncPro too?
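To illustrate that suggestion: instead of mapping all of /mnt/user/ into /sourcefiles, the template's host path could be narrowed to the folder in question (values taken from the earlier post, adjust to your setup):

Container Path: /sourcefiles

Host Path: /mnt/user/Nextcloud/nextcloud/data/username1/files/InstantUpload/Camera

With that mapping, the source to pick inside DirSyncPro would simply be /sourcefiles.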

Link to comment
