[Support] ich777 - Application Dockers


ich777


1 hour ago, polishprocessors said:

all RAW files of mine

Can you send me such a RAW file (or a few of them if they are not too big; you could compress them with WinZip or WinRAR) via PM so that I can test it?


Based on the errors, the devs think it might be a metadata string issue:
 

-Looks like the photo location can't be created properly due to a broken metadata string. Need to know what it is specifically to provide a fix if needed.

-Might be related to the location details, which our backend returns... so it would be good to test as guessing takes a lot of time :)

 

I'll send you a PM with a RAW file, but any idea what metadata strings might be broken?

11 minutes ago, polishprocessors said:

Ok, something of an update: it might actually just be that the specific RAW filetype (Fuji .raf) isn't properly handled by darktable and instead needs to be processed by RawTherapee. I've created an issue asking the devs to force Fuji .raf files to be processed by RawTherapee instead... https://github.com/photoprism/photoprism/issues/1362

Thank you for the feedback and for creating the issue, much appreciated. :)

 

Please report back what the outcome of this issue is.

1 hour ago, dam_j said:

 

Are you still giving up? 🙂

 

No, this is currently on my to-do list; as a workaround you could use cron on the host for now.

I have many projects going on right now.

But it is on my to-do list... ;)

47 minutes ago, luk said:

Is it possible to use the DoH-Server with Nginx Proxy Manager? I already run NPM and I don't know if I can run both.

Yes, that should be possible.

Where are you running into issues?

 

I run it with SWAG and it runs just flawlessly.

 

I run it with a subdomain. Something like dns.yourdomain.net should work just fine. ;)

On 6/7/2021 at 7:23 PM, ich777 said:

No, this is currently on my to-do list; as a workaround you could use cron on the host for now.

I have many projects going on right now.

But it is on my to-do list... ;)

 

Hey, I've been checking this on a daily basis.

Could you explain what you meant with "cron from the host"?

Is there a way to make a cron job within Unraid to run the tasks within luckyBackup, or..?

1 hour ago, REllU said:

Could you explain what you meant with "cron from the host"?

Sure thing, create a new Cron entry on the host with the following script contents and your preferred execution schedule:

#!/bin/bash
docker exec -i --user luckybackup luckyBackup env DISPLAY=:0 /usr/bin/luckybackup --silent --skip-critical /luckybackup/.luckyBackup/profiles/default.profile

 

A short explanation:

  • docker exec -i --user luckybackup (sends a command to the container as user "luckybackup")
  • luckyBackup (this is the actual name of the container, change it if your container is named differently)
  • env DISPLAY=:0 /usr/bin/luckybackup --silent --skip-critical /luckybackup/.luckyBackup/profiles/default.profile (is the actual command that is passed to the container for it to execute - ATTENTION: change the text 'default' near the end to e.g. 'myjob' (without quotes) if the job you created in luckyBackup is named 'myjob')


(Please make sure you've run the task once from the GUI, otherwise the Cron job will not work)

 

If you have any questions feel free to ask. :)
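For the schedule itself, if you prefer plain host cron over the User Scripts plugin's scheduler, an entry could look like the sketch below (the 03:00 time and the script path are placeholders, not part of the original instructions; point it at wherever you saved the script above):

```shell
# Hypothetical host crontab entry: run the backup script every night at 03:00.
# /boot/config/custom/luckybackup-cron.sh is a placeholder path; adjust it
# to where you actually saved the script with the 'docker exec' command.
0 3 * * * /bin/bash /boot/config/custom/luckybackup-cron.sh
```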

20 minutes ago, ich777 said:

Sure thing, create a new Cron entry on the host with the following script contents and your preferred execution schedule:


#!/bin/bash
docker exec -ti --user luckybackup luckyBackup env DISPLAY=:0 /usr/bin/luckybackup --silent --skip-critical /luckybackup/.luckyBackup/profiles/default.profile > /dev/null 2>&1

 

A short explanation:

  • docker exec -ti --user luckybackup (sends a command to the container as user "luckybackup")
  • luckyBackup (this is the actual name of the container, change it if your container is named differently)
  • env DISPLAY=:0 /usr/bin/luckybackup --silent --skip-critical /luckybackup/.luckyBackup/profiles/default.profile > /dev/null 2>&1 (is the actual command that is passed to the container for it to execute - ATTENTION: change the text 'default' near the end to e.g. 'myjob' (without quotes) if the job you created in luckyBackup is named 'myjob')


(Please make sure you've run the task once from the GUI, otherwise the Cron job will not work)

 

If you have any questions feel free to ask. :)

 

Massive thank you for the script!

I feel like I might be missing something here though; should the script work through the "User Scripts" plugin for Unraid?

 

I tried to manually run the script, and it doesn't seem to be doing anything as far as I can tell 🤔

I didn't change anything in your script, as my current task within luckyBackup is named "default" and I haven't changed the container name either.

3 minutes ago, REllU said:

I feel like I might be missing something here though; should the script work through the "User Scripts" plugin for Unraid?

Yes, you should create a new script in the User Scripts plugin.

For testing purposes you can run the full command directly from the Unraid terminal (not the container terminal!).

 

3 minutes ago, REllU said:

I tried to manually run the script, and it doesn't seem to be doing anything as far as I can tell 🤔

You will see no output, but the backup should run just fine; it runs silently, which is basically the same as when you create a Cron entry within luckyBackup.

Make sure that the task was run once.

Just now, ich777 said:

Yes, you should create a new script in the User Scripts plugin.

 

You will see no output, but the backup should run just fine; it runs silently, which is basically the same as when you create a Cron entry within luckyBackup

 

Yeah, I figured the "silent" part within the script would do that, but still, I should be getting some sort of log somewhere about this, right?

 

Also, I tried to check within luckyBackup whether those backups had been made or not (from the "task" -> "manage backup" options on top, you can see the latest versions of the backups).

Just now, REllU said:

Yeah, I figured the "silent" part within the script would do that, but still, I should be getting some sort of log somewhere about this, right?

Please read the post above again; I modified it slightly.

No, there is no log output since I'm redirecting it to /dev/null...

Remove the '> /dev/null' part and you will get output.

2 minutes ago, ich777 said:

Please read the post above again; I modified it slightly.

No, there is no log output since I'm redirecting it to /dev/null...

Remove the '> /dev/null' part and you will get output.

 

The log says:
"
Script location: /tmp/user.scripts/tmpScripts/AutoBackup/script
Note that closing this window will abort the execution of this script
the input device is not a TTY
"

Just now, REllU said:

 

The log says:
"
Script location: /tmp/user.scripts/tmpScripts/AutoBackup/script
Note that closing this window will abort the execution of this script
the input device is not a TTY
"

Does this happen when executing from the terminal itself or from User Scripts? If from User Scripts, change '-ti' to '-i' (I always forget about that... :D ).
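Some background on why that happens: the 't' in '-ti' asks Docker to allocate a pseudo-TTY, which only works when the command itself is attached to a terminal; cron and the User Scripts scheduler run without one, hence "the input device is not a TTY". A small sketch (no Docker needed) of how a script can check which situation it is in:

```shell
#!/bin/bash
# '[ -t 0 ]' tests whether file descriptor 0 (stdin) is a terminal.
# Interactive shell:   true  -> 'docker exec -ti' would work.
# cron / User Scripts: false -> use 'docker exec -i' (no TTY allocation).
if [ -t 0 ]; then
  echo "stdin is a TTY: -ti is fine"
else
  echo "stdin is not a TTY: use -i"
fi
```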

3 minutes ago, ich777 said:

Does this happen when executing from the terminal itself or from User Scripts? If from User Scripts, change '-ti' to '-i' (I always forget about that... :D ).

 

Aces! That seemed to fix it, and the script seemed to run just fine! :)

And I'm getting a log from it, which is good for me.

 

The only issue I'm having now is that it doesn't seem to mark the last backup within "task" -> "Manage backups".

 

So in the case of restoring a backup, it doesn't show up in the menu 🤔

3 minutes ago, REllU said:

The only issue I'm having now is that it doesn't seem to mark the last backup within "task" -> "Manage backups".

Try this command:

docker exec -i --user luckybackup luckyBackup env DISPLAY=:0 /usr/bin/luckybackup --silent --skip-critical /luckybackup/.luckyBackup/profiles/default.profile > luckybackup/.luckyBackup/logs/default-LastCronLog.log 2>&1

 

3 minutes ago, ich777 said:

Try this command:


docker exec -i --user luckybackup luckyBackup env DISPLAY=:0 /usr/bin/luckybackup --silent --skip-critical /luckybackup/.luckyBackup/profiles/default.profile > luckybackup/.luckyBackup/logs/default-LastCronLog.log 2>&1

 

Log:

"
/tmp/user.scripts/tmpScripts/AutoBackup/script: line 2: luckybackup/.luckyBackup/logs/default-LastCronLog.log: No such file or directory
"

6 minutes ago, REllU said:

Log:

"
/tmp/user.scripts/tmpScripts/AutoBackup/script: line 2: luckybackup/.luckyBackup/logs/default-LastCronLog.log: No such file or directory
"

Then I can't help further; you won't see it in luckyBackup because this is basically the same command that luckyBackup itself runs with a Cron schedule.

 

But at least you can run the backups on a schedule this way for now...
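For what it's worth, the "No such file or directory" message comes from the host shell rather than from luckyBackup: the redirect target 'luckybackup/.luckyBackup/logs/...' is a relative path, which the host shell resolves against its current working directory before docker exec even starts. A small sketch that reproduces the failure without Docker:

```shell
#!/bin/bash
# Try the same kind of relative redirect from a fresh, empty directory.
# The shell must open the target file before running the command, and it
# cannot, because the intermediate directories do not exist on the host.
cd "$(mktemp -d)"
if echo test 2>/dev/null > luckybackup/.luckyBackup/logs/demo.log; then
  echo "relative redirect worked"
else
  echo "relative redirect failed: path does not exist on the host"
fi
```

A possible workaround (an assumption on my part, not tested against this container) would be to redirect to an absolute path on the host, for example somewhere under your appdata share, after creating that directory once with mkdir -p.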

3 hours ago, ich777 said:

Then I can't help further; you won't see it in luckyBackup because this is basically the same command that luckyBackup itself runs with a Cron schedule.

 

But at least you can run the backups on a schedule this way for now...

 

I suppose it has something to do with permission issues (at least that's what I was able to gather from some quick googling).

 

Either way, thank you for the amazing support! :)

I'll be doing my daily check-ins here to see when the issue gets fixed within the GUI.

1 minute ago, arturovf said:

Hello, the MegaSync Docker does not delete files in the cloud after I delete them in the synced folder!

How do you sync or what did you do exactly?

I think you are deleting files on the share and they are not deleted in the cloud, or am I wrong?

 

I now tried it and I can't reproduce this.

When I delete a file from the folder that I have on Unraid for the MegaSync container, it is also deleted from the cloud.

1 minute ago, ich777 said:

How do you sync or what did you do exactly?

I think you are deleting files on the share and they are not deleted in the cloud, or am I wrong?

 

I now tried it and I can't reproduce this.

When I delete a file from the folder that I have on Unraid for the MegaSync container, it is also deleted from the cloud.

It seems to happen only with CA_backup.tar.gz files and their folder! It used to work fine; I only realized it's not working because of the full-cloud-space warning from Mega.

