IKWeb

Members
  • Posts: 114

Everything posted by IKWeb

  1. Hello all, I am new to Python and want to run some test scripts in a Docker container. I am currently using a website called PythonAnywhere, but I am finding myself hitting the CPU limits they have. What is a good Docker image that I can use to run scripts, and better still, do any of them have a web GUI like PythonAnywhere does?
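For a browser-based setup similar to PythonAnywhere, one option is to run a Jupyter container on the Docker host - a minimal sketch, assuming Docker Compose is available and using the public jupyter/minimal-notebook image (image, tag, port, and paths here are suggestions, not requirements):

```yaml
# Hypothetical docker-compose.yml for a self-hosted Python sandbox.
services:
  notebook:
    image: jupyter/minimal-notebook:latest
    ports:
      - "8888:8888"              # Jupyter's web UI
    volumes:
      - ./work:/home/jovyan/work # persist notebooks/scripts on the host
```

Start it with `docker compose up -d` and browse to http://SERVERIP:8888 (the first run prints a login token in the container log).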
  2. Basically I can't get the installer to work over HTTPS, so I have to do it over HTTP, and then when I switch it afterwards it breaks the WP install. Once I install the container, if I go to https://IPADDRESS:PORT the installer doesn't come up - but if I go to http://IPADDRESS:PORT the installer works.
  3. Hello All, I am after some help, as whatever I do ends up bricking my WP install and I end up removing it all and having to start again. So I have NPM set up to forward to the IP and port I am using for the WordPress Docker, and to force SSL - the same settings as I would use for, say, NextCloud. When I install WP I have to do it via the Docker's IP over port 80; when I try to run the installer over HTTPS it errors. So when the install is done I end up with a WordPress site running via port 80, and when I try https://IPADDRESS:PORT it fails. So I go back to HTTP and log in to WP. When I try to change either of these 2 URLs to https://IPADDRESS:PORT, or just IPADDRESS, or even the external URL, it just breaks the system. How the heck do I set up the external URL within WP and switch it over to SSL, so NPM can forward the traffic and it works like NextCloud does... I am at a total loss. Thank you to anyone who can help / advise.
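A common pattern for WordPress behind an SSL-terminating proxy like NPM is to pin the site URLs and honour the X-Forwarded-Proto header in wp-config.php - a sketch only, not verified against this exact setup, with example.com as a placeholder for the real external URL:

```php
<?php
// Sketch for wp-config.php, above the "That's all, stop editing!" line.
// example.com is a placeholder - use your external URL.
define('WP_HOME',    'https://example.com');
define('WP_SITEURL', 'https://example.com');

// NPM terminates SSL and forwards plain HTTP to the container, so tell
// WordPress the original request was HTTPS via X-Forwarded-Proto.
if (isset($_SERVER['HTTP_X_FORWARDED_PROTO'])
    && $_SERVER['HTTP_X_FORWARDED_PROTO'] === 'https') {
    $_SERVER['HTTPS'] = 'on';
}
```

With the constants set in the file, the two URL fields in the WP admin become read-only, which avoids breaking the install by editing them from the dashboard.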
  4. Hello All, I am after some help please - I am using Nginx Proxy Manager as my reverse proxy. All is fine and I have NextCloud working fine, but when I try to download files greater than 1GB the connection gets reset. From what I have read online I need to add the directive below to my Nginx config, but I am unsure which file to add it to. The directive is: proxy_buffering off; If anyone could let me know which file I need to add this to, that would be great. Or if I am completely on the wrong path, any feedback would be welcome. Thank you
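In Nginx Proxy Manager specifically, per-host directives can usually be pasted into the proxy host's "Advanced" tab (Custom Nginx Configuration) rather than edited in a file by hand. A sketch of directives often used for large transfers - the directive names are standard nginx, but the size/timeout values are assumptions to tune:

```nginx
proxy_buffering off;            # stream the response instead of spooling it to disk
proxy_request_buffering off;    # same treatment for large uploads
client_max_body_size 0;         # don't cap the request body size
proxy_read_timeout 3600s;       # allow long-running transfers
```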
  5. Hello All, I am new to CrashPlan and have just signed up to the basic $10 per month package of CrashPlan Pro. I want to back up my files and use some kind of security to make sure they are safe. When I add the new destination I see an option for security in the tab - but I only get a toggle that asks me to use my password for opening the app (screenshot below). If I want to add encryption to my files that are being uploaded, where would I go to do this? Thank you
  6. Hello @Djoss, thank you for taking the time to reply. When I look at the advanced settings I can't see which option would make that location writable. Are you able to advise, or let me know if there are some help docs online I can read through? Many Thanks
  7. Hello All, quick question - I have reinstalled CrashPlan and set the storage location to /mnt/user/ When I try to do a restore, even though the share is open, CrashPlan says it's read-only. What am I missing? I assume a setting somewhere? Thanks
  8. @Hoopster so the share I am doing the above copies to is set to run from the cache, and I assume it then moves to the array overnight. But those speeds writing to the cache are so bad. I would be happy to write to the cache at full speed and then have it run at 50MB/s overnight as the mover is invoked, but I thought this was what happens when you tell the share to use the cache drive??
  9. Thanks both. So basically if you have more than, say, 10GB of data that you want to write to the server in one go, UNRAID is basically useless, as it craps itself and takes what starts out at 300-400MB/s down to 50MB/s 😮 Think it's time to bin UNRAID for me and move to a solution that can take data that's being written to it without crapping itself. Thanks for taking the time for the above replies though 👍
  10. Hello All, hope everyone is fine this Easter weekend. I am about at the end of my knowledge with UNRAID and have got to the point where I am about to revert the server back to a Windows-based OS (where it worked without issue for 3 years), but with UNRAID I am getting issues - and my hope is that someone could advise / help / suggest something. So I have a Dell server, 100GB RAM, and 2 CPUs, total overkill for an UNRAID server that is just sitting around being a NAS - no VMs and a few Docker images. I have at the moment 8 SATA HDDs connected in the server's backplane, and a new SSD connected as the cache drive. The server itself is still under Dell warranty, all hardware passes self-tests, and I have also sent all hardware logs off to Dell and they have confirmed no issue with any hardware.

The issue: when I copy or move data around on the server, all starts off well - I get around 350MB/s writing to the server and all looks great. Then about 10GB into the copy it drops off and goes down to around 60MB/s (screenshot below). When I use Glances to take a look at what's going on, I see that IOWait shoots up to 14% in the red and I get warnings or critical alerts showing up. UNRAID shows the CPU usage, and while the copy is being done the CPUs are showing around 12%, so it's not like they are being pinned. After the file copy has completed - around 2 min later - everything goes back to normal. I have also found I can trigger warnings if I go into Docker and select "check for updates" for all Dockers running - this will give me CPU_IOWAIT errors in Glances. The CPUs in this server are 2 x Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz, both 8 cores each - so in no way high-end CPUs, but more than enough to run what I am doing, and they worked 100% under a Windows OS even running a number of VMs in Hyper-V with huge data throughput. So this is where I am totally confused and I don't know where to go from here.

I have also attached my diagnostics file; if someone could take a look I would be very grateful. In case it's a config issue I would also be open to contracting someone with good standing on the forum to work to fix the issues, as my knowledge of the underlying system in UNRAID isn't great. From what I read on the UNRAID forums, this issue isn't just me - lots of others have reported the same issue but never found a fix. Could it be something within UNRAID itself? A bug maybe? TIA diagnostics-20210402-1334.zip
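One way to narrow down whether it is the cache device itself or the share/network path is a sustained local write test with dd - a minimal sketch (on UNRAID you would point it at e.g. /mnt/cache/test.bin or a share path; /tmp/test.bin is used here only as a neutral stand-in):

```shell
# Write 512 MiB, forcing data to disk before dd reports throughput,
# so the number reflects the device rather than the RAM page cache.
dd if=/dev/zero of=/tmp/test.bin bs=1M count=512 conv=fdatasync status=progress
rm -f /tmp/test.bin
```

If the cache path alone shows the same fast-then-collapse pattern, that points at the SSD (many consumer SSDs drop sharply once their internal write cache fills); if the cache path stays fast, the problem is further up the stack.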
  11. Hello All, does anyone know / can confirm if Fusion-io cards work under UNRAID? Where I work we have just removed over 50 of these from our servers to aid the move to VMware 7, and I am in the process of looking to replace my cache drive. So I am thinking of dropping in 2 x 1.6TB Fusion-io drives to act as the cache drive. TIA
  12. Hi @jonathanm - it is an old SSD, it's a Crucial M500 CT960.
  13. @RedReddington - Did you ever get a fix for this? I am having the same issue.
  14. Quick update - when I copy a large amount of data from one share to another (both shares set to use the cache), the copy starts off around 300MB/s or above, but after 5 min drops down to 20MB/s and IOWAIT hits 10%. I am wondering if it's the cache drive? But it's showing no issues. The SSD does show this - so maybe this is the cause
  15. Evening all - so when I do anything with my active Dockers I get CPU_IOWAIT errors showing up (for example, updating the containers). Posts on this forum say it's a bug within UNRAID (though a few versions have been released between that post and now), but I also wanted to check with people that know more than me that my locations are correct. All my AppData paths within the Docker templates for things like Plex and Sonarr are on the cache drive, but the docker.img looks like it's on normal spinning disks (highlighted below). Do I need to update these paths to be the cache drive? Finally, the behaviour I am getting is that when I copy a large amount of data to my cache drive, I lose all WebUIs for the server itself, and for all Dockers that have a WebUI, until the copy is done. The copy is being done to a share that is on the cache drive (so it's not copying direct to the array). These are the advanced settings. These are the errors I am getting.
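As a cross-check on what Glances reports, the kernel's own iowait counter can be read straight from /proc/stat - a minimal sketch:

```shell
# Field 6 of the aggregate "cpu" line is cumulative iowait time in
# clock ticks since boot; sample it twice and compare to see the rate.
awk '/^cpu /{print "iowait ticks:", $6}' /proc/stat
sleep 5
awk '/^cpu /{print "iowait ticks:", $6}' /proc/stat
```

A rapidly growing delta during a copy confirms the CPUs are idle-but-waiting on the disks, which matches low CPU% combined with an unresponsive WebUI.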
  16. @Exilepc - Did you ever find a fix for this? I am getting the same issues; while copying large amounts of data to the array it kills the WebUIs for the server itself and all the Docker containers.
  17. No - that is setting up a VPN for when you are out and about and want to connect back into your network. What I am looking for is a Docker container that will run, that I add my VPN settings to, and it connects. Then, if I want to use the VPN, I can add proxy settings on my PC etc. pointing at the IP address and port of that container (which is connected to the VPN), and thus my traffic goes out over the VPN.
  18. Hello All, this may be a stupid question - but is there a Docker image that I can download and set up with my VPN info, that will then connect to the VPN I have added? Then what I would like to do, if I want to use that VPN, is put proxy info (say on my PC) pointing at that container, and it will then push my traffic out over the VPN. This isn't for torrents etc. - this is so I can use services from outside the UK, so I don't need anything to cover torrents or NZBs. Thank you
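One commonly used image for exactly this is qmcgaw/gluetun, which can expose an HTTP proxy that routes through the VPN tunnel - a sketch only, with the provider and credentials as placeholders to replace with your own VPN details:

```yaml
services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN
    devices:
      - /dev/net/tun:/dev/net/tun
    environment:
      - VPN_SERVICE_PROVIDER=nordvpn   # placeholder; use your provider
      - OPENVPN_USER=changeme
      - OPENVPN_PASSWORD=changeme
      - HTTPPROXY=on                   # enable the built-in HTTP proxy
    ports:
      - "8888:8888"                    # gluetun's default HTTP proxy port
```

Then set the proxy on the PC to SERVERIP:8888 and that machine's traffic exits via the VPN, while everything else on the network is unaffected.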
  19. Plex Hardware

    Evening all, so I have been using Plex for a very long time and it has mainly been myself running it around the house, never used on more than 2 TVs at the same time. Now I have some family members linked in to watch some stuff over lockdown, and I am not 100% sure the CPU can take all the streams - maybe around 4/5 at a time. I know you can install a GPU to help, but I have not done this before and am not 100% sure what GPU to look for. My setup is running on a Dell R530, so I have to keep this in mind with the GPU as I can't just shove in a huge card. Plex is also running as a Docker on UNRAID, so I assume I will need drivers and to pass the GPU through to the Docker for Plex to use. Any advice would be welcome, thank you
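On UNRAID the usual route for NVIDIA transcoding is the Nvidia Driver plugin plus two tweaks on the Plex container template - a sketch shown here in compose form (plexinc/pms-docker is the official image; the runtime flag and variable names are the standard NVIDIA container ones, and the values are placeholders):

```yaml
services:
  plex:
    image: plexinc/pms-docker
    runtime: nvidia                  # on UNRAID: add --runtime=nvidia to Extra Parameters
    environment:
      - NVIDIA_VISIBLE_DEVICES=all   # or a specific GPU UUID from the plugin page
      - NVIDIA_DRIVER_CAPABILITIES=compute,video,utility
```

Hardware transcoding then still needs to be switched on in Plex's own transcoder settings (and requires a Plex Pass subscription).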
  20. CPU Usage

    Ahh, thank you for that @ChatNoir and also @JorgeB
  21. CPU Usage

    Sorry @JorgeB, what do you mean?
  22. Hello All, am I missing something here? The WebUI is really slow, as are my Dockers. I finally managed to get the WebUI to open and it shows my CPUs running at 35%, and I can't understand why that would stop the web UI from loading. But when I run the top command nothing is close to 35% CPU usage. Am I missing something?
  23. Hi @JorgeB - yep, that has now been done, but I am still getting the same issues. CPUs will spike, the web GUI becomes totally unresponsive, as do all web-based Docker containers. The web GUI will throw an internal 500 server error.
  24. I'm still getting forced shutdowns, so if someone could look at the diag reports, I would be grateful if they could point me in the right direction. TIA