[Support] ich777 - Application Dockers


ich777

Recommended Posts

7 minutes ago, shpitz461 said:

Is there an encrypted Linux fs that can be read on Windows?

I really don't know, but look into Paragon Linux Filesystem. It's paid software and it works really well, but I don't know if it will work with encrypted drives… you can try at least, they have a 30 day trial I think.

  • Upvote 1
Link to comment

Hi! I'm a newbie looking for help.

I've been trying to install MariaDB in order to create a database for PhotoPrism. I followed the Ibracorp video guide but I get stuck at MariaDB. I get this error:

 

docker run
  -d
  --name='mariadb'
  --net='bridge'
  -e TZ="America/Los_Angeles"
  -e HOST_OS="Unraid"
  -e HOST_HOSTNAME="Depa14"
  -e HOST_CONTAINERNAME="mariadb"
  -e 'MYSQL_ROOT_PASSWORD'='redacted'
  -e 'MYSQL_DATABASE'='depa14db'
  -e 'MYSQL_USER'='depa14'
  -e 'MYSQL_PASSWORD'='redacted'
  -e 'REMOTE_SQL'='http://URL1/your.sql,https://URL2/your.sql'
  -e 'PUID'='99'
  -e 'PGID'='100'
  -e 'UMASK'='022'
  -l net.unraid.docker.managed=dockerman
  -l net.unraid.docker.icon='https://raw.githubusercontent.com/linuxserver/docker-templates/master/linuxserver.io/img/mariadb-logo.png'
  -p '3306:3306/tcp'
  -v '/mnt/user/appdata/mariadb':'/config':'rw' 'lscr.io/linuxserver/mariadb'

Unable to find image 'lscr.io/linuxserver/mariadb:latest' locally
docker: Error response from daemon: Get "https://ghcr.io/v2/": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers).
See 'docker run --help'.

The command failed.

 

Thank you in advance.

Edited by Roberto Ornelas
Link to comment
40 minutes ago, Roberto Ornelas said:

docker: Error response from daemon: Get "https://ghcr.io/v2/": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers).

As you can see, your server can't reach the repo ...

 

Are any Pi-holes, ad blockers, etc. running in your LAN?

 

Maybe try to reach https://ghcr.io from another client to see if you can reach the site.
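A quick way to test that from any machine with curl (just one way to check, the exact status code doesn't matter much):

curl -s -o /dev/null -w '%{http_code}\n' https://ghcr.io/v2/
# prints only the HTTP status code - a reachable registry answers within a second or two
# (typically 401, which is fine here); a long hang ending in a timeout points to DNS,
# firewall or ad-blocker filtering somewhere on your network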

 

And BTW, maybe post in the proper support thread, this is not the LSIO MariaDB thread ... and it's not related to PhotoPrism either ;)

  • Thanks 1
Link to comment

For Chromium, is there some way to enable remote debugging? I've tried using the flag:

--remote-debugging-port=9222

but it says it's not valid for some reason. I'm desperately trying to get something working that requires a login, and these are the only non-headless Chromium dockers I can find. Help would be very appreciated.

 

 

Link to comment
7 minutes ago, ich777 said:

I‘ll have to look into it.

Which Unraid version are you on?


I don't have issues running a little python3 script.

 

How have you installed Python3?

6.11.5 - running it on bare metal works fine, but the two dockers I tried have an issue with python3 (complaining about glibc). I installed Python with `sudo apt install python3` - something like that.

Link to comment
10 minutes ago, ich777 said:

You have to add it in the Extra Parameters at the variable, not at the top with Advanced View.

Thank you! That got it installed.

 

The remote debugging doesn't seem to work though. With chromedp/headless-shell I was able to reach IP:9222/json/version from other machines and dockers, and I can't with this so far. I can reach localhost:9222/json/version from within the Chromium instance itself, but it doesn't seem to be accessible from anything else.

Link to comment

Apparently the "--remote-debugging-address" flag only works for headless Chromium. I'm guessing that's why I could connect to chromedp but not to this.
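For reference, the difference seems to boil down to something like this (just a sketch, the port is an example):

# headless: the DevTools endpoint can bind to all interfaces, so other hosts/containers can reach IP:9222
chromium --headless --remote-debugging-port=9222 --remote-debugging-address=0.0.0.0

# GUI chromium: --remote-debugging-address is ignored and the DevTools socket binds to
# 127.0.0.1 only, so IP:9222/json/version is unreachable from outside the instance
chromium --remote-debugging-port=9222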

 

So I guess I'm stuck in a catch-22. I can have GUI Chromium where I can log in to the site I need, but can't connect to the session from another docker. Or I can have headless Chromium accessible from another docker, with no way to log in to the site I need.

 

🤬 😭

 

 

  • Thanks 1
Link to comment
23 hours ago, Econaut said:

6.11.5 - running it on bare metal works fine, but the two dockers I tried have an issue with python3 (complaining about glibc). I installed Python with `sudo apt install python3` - something like that.

I'm not on 6.11.5 here on my server, I'm running something newer, but usually you don't have to install python3 in the container because it's installed right out of the box. I've now tried it again and it is working over here:
[screenshot]

 

 

I now have confirmation from another user who is on 6.11.5 that it is working:

[screenshot]
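If you want to double-check on your end, the quickest way is from the Unraid terminal (replace the placeholder with the name of your container):

docker exec -it <container-name> python3 --version
# should print the Python version that ships inside the container image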

Link to comment
8 hours ago, Goldmaster said:

Just a quick question: for some reason, when I press the WebUI globe icon, the MegaSync docker container opens at port 8080, even though I have changed it to a completely different port. I can't remember how I was able to access the WebUI. I have tried a custom docker network and bridge.

This usually has something to do with browser caching, and with changing the port in the template multiple times.

 

Have you tried changing the port in the URL to connect to it? If you do so, I assume it works correctly?

Link to comment
1 hour ago, ich777 said:

tried changing the port in the URL to connect to it? If you do so, I assume it works correctly

Yes I have. I think it's something to do with the docker container using the default HTTPS port that Unraid uses, so there is a conflict.

 

Update

The network setting has to be Custom: br0 and then just put in a port value. Not sure why the docker still opens at port 8080, though?

 
Edited by Goldmaster
Link to comment
59 minutes ago, Goldmaster said:

The network setting has to be Custom: br0 and then just put in a port value. Not sure why the docker still opens at port 8080, though?

Because you are now on a custom IP, so it will expose all ports, just like a PC on your local network, and it will use the port that is used inside the container.

 

On br0 the port mappings simply do nothing.
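To illustrate with plain docker commands (image name and IP are only examples):

# bridge: the -p mapping decides which host port reaches the container
docker run -d --net=bridge -p 8181:8080 example/megasync

# custom network (br0): the container gets its own IP on your LAN, any -p mappings are
# ignored, and the app answers on its internal port, e.g. http://192.168.1.50:8080
docker run -d --net=br0 --ip=192.168.1.50 example/megasync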

Link to comment

Hi @ich777

I'm hoping you could perhaps help me or point me in the right direction 🙂

I'm using luckyBackup to clone some of my Unraid shares from my Unraid Server 1 to Unraid Server 2 (all inside the same home network). I'm loving it and it's working really well, except that each time luckyBackup copies a file from Server 1 to Server 2, the owner of that new file is set as "UNKNOWN" in Unraid.

 

For some reason it's not cloning the owner of that file, and I'm not sure what I'm doing wrong or if there is a setting somewhere that I should use. I have tried the option "use numeric group and user IDs" but I get the same result.

 

My config and settings are below for info. Any help would be much appreciated.

[screenshot]

 

[screenshot]

Link to comment
12 minutes ago, Titan84 said:

For some reason it's not cloning the owner of that file, and I'm not sure what I'm doing wrong or if there is a setting somewhere that I should use. I have tried the option "use numeric group and user IDs" but I get the same result.

Did you do anything custom, or better put: does the user which has permissions to the files exist on the second server?

Anyway, this shouldn't harm anything, because if you copy the files back over to the first machine they should have the correct owner again.
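For background: luckyBackup drives rsync under the hood, so the ownership handling roughly corresponds to something like this (paths and host are only examples):

rsync -avh --numeric-ids /mnt/user/Share1/ root@server2:/mnt/user/Share1/
# -a implies -o/-g (preserve owner and group); the receiving side has to run as root
#   for ownership to actually be applied
# --numeric-ids transfers the raw UID/GID instead of matching by user/group name, so an
#   ID without a matching account on the destination shows up without a name (Unraid
#   displays it as "UNKNOWN"), but the numbers are kept and copy back correctly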

 

Hope that makes sense.

  • Thanks 1
Link to comment
1 hour ago, ich777 said:

Did you do anything custom, or better put: does the user which has permissions to the files exist on the second server?

Anyway, this shouldn't harm anything, because if you copy the files back over to the first machine they should have the correct owner again.

 

Hope that makes sense.

Thanks for replying @ich777

No, nothing custom at all on the second server. It's a new build, and the user that has permissions to the files does exist on the second server, so it's a bit strange.

All good, thanks for explaining what happens when you copy the files back to the first server. I gave it a test and it does indeed show the correct owner again, which is perfect, because as long as I can restore things back to how they were on the first server, that works for me :-) Thanks so much again.

  • Like 1
Link to comment

What does this mean in luckyBackup?

 

I am syncing just over 2 TB of stuff and had to leave overnight. I came back to take a look at the progress and saw this after leaving luckyBackup running for several hours.

 

2238700 files...
<a name="error1"></a><font color=red>The process reported an error: "Crashed"</font>

 

So I have to start the whole "building file list" step again. I never knew luckyBackup to take this long when syncing around the same amount. I'm wondering whether to switch to DirSync Pro, as that can scan, then compare, similar to GoodSync.

Edited by Goldmaster
Link to comment
