Capt.Insano

Community Developer

  • Content Count: 252
  • Joined
  • Last visited

Community Reputation: 1 Neutral

About Capt.Insano

  • Rank: Advanced Member
  • Gender: Undisclosed

  1. Thanks a million @binhex. I am away for the next week but will defo get on it once I am home!!
  2. I had actually tried that container prior to posting, but I could not get my .ovpn file to be correctly seen by the container; I kept getting errors in the log about no configuration file being found. I do not want to go too far off topic, but would you mind posting a picture of your configuration if you have it working successfully? Otherwise: @binhex, do you have any interest in making such a container with unRAID-based settings?
  3. I was actually coming here to ask for the same thing! 1. Would it be possible to get a Privoxy/OpenVPN container without any associated app? 2. Is it possible to run a Privoxy container on a static IP (Custom: br0)? Any time I try to set a static IP for a Privoxy/OpenVPN container, the proxy fails when connecting from my Win10 laptop (see the first sketch after this list for the kind of setup I mean). Thanks a million for all your hard work binhex!
  4. Marking this as solved! No problems since. Thanks a million johnnie.black!
  5. According to that post you linked, it was to be fixed in kernel 4.14. Seeing as I am on unRAID 6.6.6 and running kernel 4.18, why is it still a problem?

      root@Tower:~# uname -a
      Linux Tower 4.18.20-unRAID #1 SMP Fri Nov 23 11:38:16 PST 2018 x86_64 Intel(R) Xeon(R) CPU X5675 @ 3.07GHz GenuineIntel GNU/Linux
  6. I will try that right away! Thanks so much. I will report back in a few days or a week on whether the problem is fixed with the above command.
  7. I have a HomeAssistant VM that often gets put into a "Paused" state, meaning that all of my home automation ceases to work. I cannot figure out the cause of the pausing, as my cache drive seems to have plenty of room. It started happening about 3 weeks ago and I made a big effort to watch cache usage, but it is still happening. Attached is my diagnostics zip, and below is some info I think may be relevant (a sketch of commands for checking why the VM was paused is after this list). I would really appreciate any help!!

      On my unRAID server:

      root@Tower:/mnt/cache/appdata/Emby# df -h
      Filesystem      Size  Used Avail Use% Mounted on
      rootfs           16G  1.1G   15G   7% /
      tmpfs            32M  316K   32M   1% /run
      devtmpfs         16G     0   16G   0% /dev
      tmpfs            16G     0   16G   0% /dev/shm
      cgroup_root     8.0M     0  8.0M   0% /sys/fs/cgroup
      tmpfs           512M   67M  446M  14% /var/log
      /dev/sda1       7.5G  973M  6.5G  13% /boot
      /dev/loop0      8.2M  8.2M     0 100% /lib/modules
      /dev/loop1      4.9M  4.9M     0 100% /lib/firmware
      /dev/md1        2.8T  2.0T  788G  72% /mnt/disk1
      /dev/md2        2.8T  1.8T  975G  66% /mnt/disk2
      /dev/md3        2.8T  1.6T  1.2T  58% /mnt/disk3
      /dev/md4        2.8T  1.5T  1.4T  52% /mnt/disk4
      /dev/md5        2.8T  1.4T  1.4T  51% /mnt/disk5
      /dev/md6        2.8T  1.2T  1.6T  42% /mnt/disk6
      /dev/sdg1       224G   80G  143G  36% /mnt/cache
      shfs             17T  9.3T  7.2T  57% /mnt/user0
      shfs             17T  9.4T  7.3T  57% /mnt/user
      /dev/loop2       30G   15G   13G  54% /var/lib/docker
      /dev/loop3      1.0G   18M  904M   2% /etc/libvirt
      shm              64M     0   64M   0% /var/lib/docker/containers/5d3a91297475dc76e4452f5274219116b50cf8acc1e114d23408361ced25dfa3/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/986c16755882f961f47fca87257bcc956001fac4566246d0787af80d7c03ed6b/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/4db3070196a30a5fd4fd8640d6219253192dad50100df1e5024388a84d6d1b02/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/e014692a6bf6f5ca5f95520a96b531efa46f5393e26454220900428125c6184d/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/3fa248fbccfd5c167afd499f0bc7ac8ec8d95264db69d1ab3dd323411059fd9e/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/0ef4dd4ce03e13bc06ab64ef606286867d9a7af9416da1a8f05f96bd544dc7c9/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/c456e84c6473518d7f35e82d0c6b6d9e5e983e529b9128856f475fadd5582e07/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/478db1159d0542a224092517686a9631b5c98f5113c9aee1fdc5b284945da10c/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/ad0c3033a45e734d63239a1b90a6d669471e55a210a7bee27bc209196c3e95b9/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/0a371e1da920fe34d6ac978efdb919cbe09505d807c319c7ad1d9243da60bf4c/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/75a880a8c4aca6806df9c89aa39efad385e5421b39201d54d3b97bd3f33edff9/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/41d90b2ea7011c9edc0eaf5272311639123bfe9a5219c5b3bfc801da0806f8e1/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/63b844d285a7fa29edc697a92104bc7524f90f12586064bf5da62b5e0a7e32e5/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/076dc4418d8fec1b68d1fe4360d051085b202a0abf65890c6804726d4b2a834b/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/67d6957eb2cfc064653ce0a405addc74b3653ef5284dfa76c151991ebdc07588/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/4ab10bafc8140e7cf16eff894d7bde93ba9c21bd9153ed3cd824619ba6f0e036/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/938340d5cc2adb21249d1666fda047319244f65cfbc7ba35c55992a1c4333380/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/30f338122358484799e49218f6cbfaaef243394154af6fea6080ddd4b3384798/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/89a1ed26ad5a1036607c68648bdb1307cd7bedbd74b6da4af2fb6f2ac6f77f47/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/2000963f7424c204931d8e45b85e12156c86abce77866ca38bcd6b3246ef802e/mounts/shm
      shm              64M  4.0K   64M   1% /var/lib/docker/containers/106a9ec9ba5793b782f6d685acf7f9af4a22903e4971db4fc945317c07021196/mounts/shm
      shm              64M  4.0K   64M   1% /var/lib/docker/containers/e1545ec7c79ce36656bea2cacb9860322929eda7cd612dcd6b5efcb61a4a1539/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/a6ae209936e5cc61237b87d094ca03549d93a3a654a49f9cbbf5e01cd8c463db/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/62b46bfa70678933d4e8e48cd41c8fbdef6c4b59d43a8ef541100027546b81c7/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/1119f358d9848e41a824cb216806244495771f0c64cbfb2b1935a6a8fc2fe297/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/535b72b6aefceb7a918a2a4977134102bd3fd3570c43e24ce0cfb0ad87cd000a/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/c6854038d0874bd4f659c95fa4d2063f5305b2994b546c347fdb801c260f90fc/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/3b266e0276f44f13e272f683bbd7714de08dbcd4c20f2f44a10d3a59f6079658/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/92c01e87c1e5a0a9bc2419b10ac40f786d8fda5805bcc42a48226f3a1c29c09c/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/7a3f861d6a2645cb83834fe4e4c2c422761c7d6da86b7ce24f675e5fd323c923/mounts/shm
      shm              64M     0   64M   0% /var/lib/docker/containers/b0505370a3b3e10f921000a0caa165b215481464ce4922c39b0a299cd5c2e0ef/mounts/shm

      Inside the HomeAssistant VM:

      USER@HomeAssistant:~$ df -h
      Filesystem      Size  Used Avail Use% Mounted on
      udev            486M     0  486M   0% /dev
      tmpfs           100M   13M   88M  13% /run
      /dev/vda2        14G  2.5G   11G  20% /
      tmpfs           497M     0  497M   0% /dev/shm
      tmpfs           5.0M     0  5.0M   0% /run/lock
      tmpfs           497M     0  497M   0% /sys/fs/cgroup
      /dev/vda1       511M  132K  511M   1% /boot/efi
      tmpfs           100M     0  100M   0% /run/user/1000

      I will gladly provide any other information that may help. Thanks again for any help! tower-diagnostics-20181221-1810.zip
  8. Scratch that, the container is now updated to the latest Tonido version! (Tested on my server and running well here.) The Capt.
  9. Hi lads, so sorry for the delay on this; I have not been around and am only seeing it now. I am away this weekend, but I have set a reminder to update this docker next week. Sorry again, The Capt.
  10. Apparently, I have been running it hourly for the last 315+ days!! You cannot blame me, it is a great plugin!! +1, this would be a great feature!
  11. Great plugin dmacias! I have been using it pretty much since launch, and therein lies the problem! I recently moved house and switched ISP, and I was eager to compare speeds and make sure there were no problems with my line. When I fired up the speedtest UI I realised that I have a total of 7579 results!! This stalls my browser (Firefox and Chrome), which prompts me to kill the script; if I allow the script to continue, the webUI does load but it is impossible to navigate. I realise that I could just delete the speedtest.xml from /boot/config/plugins/speedtest, but that would mean losing all of my past data. Would it be possible to add an option to the plugin to keep only X weeks/months/years of results and delete older ones, or even just keep the last X results? As an interim solution, I went through the xml and realised that the earlier a result appears in the file the earlier it was in time (although none of the xml entries have a date stamp on them), so I just deleted the first half of the entries (a rough sketch of that kind of trim is after this list). As always, thanks a million for your work on unRAID. The Capt.
  12. Good suggestion! Probably the best solution, but TBH a little overkill for my original plans! I thought it would be easy to just put an RDP password on it! Thanks
  13. I have been looking into a Docker-based WebUI file management solution for my server and I have started using Krusader by Sparklyballs. My question: is it possible to password protect an RDP-based Docker app? I have looked around online and changed some of the xrdp.ini settings with no success.

      Attempt 1: I changed the xrdp.ini to specify a username and password and restarted the container, but Guacamole still does not prompt me for a password and connects straight to the container.

      [xrdp1]
      name=Krusader
      lib=libxup.so
      username=<SomeUsername>
      password=<SomePassword>
      ip=127.0.0.1
      port=/tmp/.xrdp/xrdp_display_1
      chansrvport=/tmp/.xrdp/xrdp_chansrv_socket_1
      xserverbpp=16
      code=10

      Attempt 2: I changed the xrdp.ini to ask for a username and password and restarted the container. This time Guacamole asks me for a password, but any entry in the username and password fields allows entry into the container (even leaving username/password blank allowed entry).

      [xrdp1]
      name=Krusader
      lib=libxup.so
      username=ask
      password=ask
      ip=127.0.0.1
      port=/tmp/.xrdp/xrdp_display_1
      chansrvport=/tmp/.xrdp/xrdp_chansrv_socket_1
      xserverbpp=16
      code=10

      Is there any way to secure it? Thanks for any help!
  14. It is defined in the template as container port 8080 mapped to host port 9876. I was getting permission-denied issues when trying to run it on port 80 (see the last sketch below for why).
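
For the static-IP Privoxy/OpenVPN question in item 3: a minimal sketch of what unRAID's "Custom: br0" setting corresponds to on the CLI. The image name, appdata path, IP address and VPN_ENABLED variable below are placeholders/assumptions rather than a specific published container; 8118 is Privoxy's default listen port.

      # run the container on the br0 custom network with a fixed LAN address
      # (image name, IP and env vars are placeholders, not a real published image)
      docker run -d --name=privoxyvpn \
        --network=br0 --ip=192.168.1.50 \
        --cap-add=NET_ADMIN \
        -v /mnt/user/appdata/privoxyvpn:/config \
        -e VPN_ENABLED=yes \
        <some-privoxy-openvpn-image>

      # from the Win10 laptop, point the browser at the container's own address:
      #   HTTP proxy 192.168.1.50, port 8118 (Privoxy's default)

One thing worth noting: br0 on unRAID is a macvlan-style network, so the unRAID host itself cannot reach the container's IP, but other machines on the LAN can; the proxy is best tested from the laptop rather than from the server console.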
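
For the paused HomeAssistant VM in item 7: a sketch of the libvirt commands that can show whether the VM was paused deliberately or because of an I/O error (a full filesystem under the vdisk is a common cause of the latter). The domain name "HomeAssistant" is assumed to match the VM's name in the unRAID webUI.

      # list all defined VMs and their current state
      virsh list --all

      # ask libvirt *why* the domain is paused (e.g. "paused (ioerror)" when the
      # filesystem holding the vdisk runs out of space)
      virsh domstate HomeAssistant --reason

      # the qemu log for the domain can also contain the reason for the pause
      tail -n 50 /var/log/libvirt/qemu/HomeAssistant.log

      # once the underlying cause is cleared, resume the VM
      virsh resume HomeAssistant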
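
For the oversized speedtest history in item 11: a rough sketch of trimming /boot/config/plugins/speedtest/speedtest.xml down to the most recent results. It assumes (hypothetically) that the file is a flat list with one self-closing <result .../> line per run inside a single wrapper element; check the actual file layout before running anything like this, and keep the backup.

      XML=/boot/config/plugins/speedtest/speedtest.xml
      KEEP=500                                   # number of most recent results to keep

      cp "$XML" "$XML.bak"                       # keep the full history as a backup

      # assumed layout: opening wrapper line, one <result .../> line per run, closing wrapper line
      {
        head -n 1 "$XML.bak"                     # opening wrapper line
        grep '<result ' "$XML.bak" | tail -n "$KEEP"
        tail -n 1 "$XML.bak"                     # closing wrapper line
      } > "$XML"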
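
For the port question in item 14: inside a container, binding to a port below 1024 normally requires root (or the CAP_NET_BIND_SERVICE capability), which is the usual reason for a permission-denied error on port 80. Mapping a host port onto the container's unprivileged port sidesteps that; the image and binary names below are placeholders.

      # the app listens on 8080 inside the container; expose it on host port 9876
      docker run -d -p 9876:8080 <image>

      # alternatively, grant the binary the capability to bind low ports
      # (path to the app binary is a placeholder)
      setcap 'cap_net_bind_service=+ep' /path/to/app-binary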